Human Generated Data

Title

Untitled (family portrait in living room)

Date

1945

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21919

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 96.2
Human 96.2
Person 95.7
Person 93.1
Person 89.4
Person 84.9
Person 83.7
Furniture 83.1
Room 73.6
Indoors 73.6
Costume 73.5
People 73.2
Nature 69.3
Outdoors 66.7
Clothing 61.6
Apparel 61.6
Toy 61.2
Living Room 59.6
Person 48.8

Clarifai
created on 2023-10-22

people 99.9
group 99.4
many 97.2
adult 96.2
child 95.2
man 93.9
group together 93.6
education 91.7
school 91.4
military 89.8
woman 89.2
boy 86.8
wear 86.3
dancing 86.1
war 85.8
leader 85.6
interaction 84.9
several 83.3
soldier 82.8
art 81.9

Imagga
created on 2022-03-11

grunge 35.8
old 22.3
dirty 21.7
texture 20.8
vintage 20.1
black 18
snow 17.8
art 17.3
structure 17.2
aged 17.2
grungy 17.1
antique 16.8
decoration 15.2
graffito 14.9
design 14.6
fountain 14.6
paint 14.5
pattern 14.4
landscape 14.1
wall 13.7
drawing 13.6
dark 12.5
architecture 12.5
frame 12.5
silhouette 12.4
retro 12.3
city 11.6
park 11.5
building 11.5
negative 11.4
light 11.4
forest 11.3
rough 10.9
man 10.8
weather 10.8
tree 10.8
wallpaper 10.7
surface 10.6
autumn 10.5
textured 10.5
artistic 10.4
brown 9.6
rock 9.6
ancient 9.5
weathered 9.5
winter 9.4
space 9.3
splash 9
style 8.9
mountain 8.9
urban 8.7
mist 8.7
stain 8.6
paper 8.6
worn 8.6
season 8.6
film 8.5
chandelier 8.5
people 8.4
border 8.1
sketch 8.1
painting 8.1
sun 8
material 8
graphic 8
night 8
messy 7.7
fog 7.7
sport 7.4
water 7.3
peaceful 7.3
color 7.2
creative 7.1
travel 7

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

drawing 96.6
text 96.1
outdoor 89.6
person 85.5
sketch 84.1
black and white 80.3
cartoon 74.9
old 44.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Calm 95.8%
Confused 2.3%
Surprised 0.9%
Sad 0.7%
Happy 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Calm 99.6%
Sad 0.3%
Happy 0.1%
Confused 0%
Angry 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 96.6%
Calm 99.4%
Sad 0.4%
Confused 0.1%
Surprised 0.1%
Happy 0%
Fear 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 33-41
Gender Female, 53.7%
Happy 98.7%
Surprised 1%
Calm 0.1%
Sad 0.1%
Fear 0%
Angry 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 41-49
Gender Male, 89.9%
Calm 91.8%
Confused 3.3%
Surprised 1.5%
Angry 1.3%
Happy 0.9%
Sad 0.6%
Disgusted 0.5%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 96.2%
Person 95.7%
Person 93.1%
Person 89.4%
Person 84.9%
Person 83.7%
Person 48.8%

Text analysis

Amazon

a
YY33A2
4334
МЭТА YY33A2 4334
МЭТА