Human Generated Data

Title

Untitled (three young children sitting on living room floor looking at miniature Christmas tree)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9233
Machine Generated Data

Tags

Amazon
created on 2022-01-23

Play 99.6
Person 99.3
Human 99.3
Person 98.4
Clothing 97.7
Apparel 97.7
Dress 91.2
Person 90.5
Shorts 87.8
Face 83.7
Kid 79
Child 79
Outdoors 77.5
Furniture 74.4
Helmet 73.7
Table 73.6
Female 72.5
Plant 72.4
Girl 70.7
Portrait 70.1
Photography 70.1
Photo 70.1
Nature 68.6
Meal 67
Food 67
Sphere 64.9
Vase 63.7
Pottery 63.7
Jar 63.7
Potted Plant 63.5
Water 62.6
Baby 58.5
Animal 56.2
Sea 55.9
Ocean 55.9

Imagga
created on 2022-01-23

person 30.1
people 24
man 20.1
sport 19.1
male 17
adult 16.7
planner 14.7
black 13.8
athlete 13.5
world 12.2
fashion 12
portrait 11.6
outdoor 11.5
body 11.2
clothing 11
model 10.9
player 10.7
human 10.5
one 10.4
play 9.5
happy 9.4
lifestyle 9.4
future 9.3
art 9.1
vintage 9.1
headdress 9
fun 9
outdoors 9
style 8.9
hair 8.7
water 8.7
war 8.6
sitting 8.6
face 8.5
active 8.5
summer 8.4
dark 8.3
protection 8.2
symbol 8.1
helmet 8
urban 7.9
bathing cap 7.8
military 7.7
outside 7.7
background 7.6
field 7.5
ball 7.5
city 7.5
negative 7.5
equipment 7.4
park 7.4
sunglasses 7.4
freedom 7.3
group 7.2
sexy 7.2
looking 7.2
women 7.1
day 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96.7
person 90.8
black and white 89.5

Face analysis

Amazon

AWS Rekognition

Age 7-17
Gender Female, 66.4%
Surprised 89.7%
Calm 4.4%
Fear 3.5%
Angry 0.9%
Happy 0.8%
Disgusted 0.3%
Sad 0.3%
Confused 0.1%

AWS Rekognition

Age 7-17
Gender Female, 99.3%
Calm 97.6%
Happy 0.9%
Sad 0.8%
Surprised 0.2%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%
Fear 0%

Feature analysis

Amazon

Person 99.3%
Helmet 73.7%

Captions

Microsoft

a group of people jumping in the air 75.5%
a group of people in front of a window 74.7%
a person doing a trick on a skateboard 28.3%

Text analysis

Amazon

VI72A2
yogor