Human Generated Data

Title

Untitled (eight family members posed looking at girl crouched at edge of couch in living room with Christmas tree to one side)

Date

1949

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6516

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Apparel 99
Clothing 99
Person 98.8
Human 98.8
Person 97.9
Person 97.6
Person 97.3
Person 95.4
Person 94.6
Furniture 92.2
Stage 91.9
Person 91.8
Sitting 79.8
Robe 76.8
Fashion 76.8
Gown 76.1
People 75.4
Person 73.8
Coat 71.9
Overcoat 71.9
Floor 71.2
Dress 69.2
Female 69
Suit 67.3
Wedding 66.3
Indoors 66
Flooring 62.2
Wedding Gown 59.5
Evening Dress 58.7
Living Room 57.7
Room 57.7
Crowd 57
Couch 56.9
Woman 55.7
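Each machine-generated tag above pairs a label with a confidence score (0-100). A common downstream step is to keep only labels at or above a chosen threshold. The sketch below assumes that workflow; the sample data is a hypothetical subset transcribed from the Amazon tags in this record, and `filter_labels` is an illustrative helper, not part of any vendor API.

```python
# Hypothetical subset of the Amazon label list above: (name, confidence).
labels = [
    ("Apparel", 99.0),
    ("Person", 98.8),
    ("Furniture", 92.2),
    ("Sitting", 79.8),
    ("Couch", 56.9),
]

def filter_labels(labels, threshold):
    """Return (name, confidence) pairs at or above the threshold,
    sorted from most to least confident."""
    kept = [(name, conf) for name, conf in labels if conf >= threshold]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

# With a 90% cutoff, only the three high-confidence labels survive.
print(filter_labels(labels, 90.0))
```

The threshold is a judgment call: the tag lists above show that plausible labels ("Living Room", "Couch") can score well below 60, so aggressive filtering trades recall for precision.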

Clarifai
created on 2019-03-25

people 99.9
group 99.6
group together 99.1
adult 98.4
leader 98.1
many 98.1
woman 97.4
man 97.2
administration 96.9
child 92
several 91.4
wear 91
five 89.8
music 87.3
furniture 87.1
sit 86.4
outfit 86.2
room 85
ceremony 84.3
recreation 83.4

Imagga
created on 2019-03-25

man 36.4
marimba 33.6
people 33.4
musical instrument 32.4
male 29.8
percussion instrument 29.6
person 28.7
businessman 23.8
women 23.7
business 23.7
men 23.2
adult 21.7
sitting 20.6
couple 20
teacher 19.3
group 17.7
happy 17.5
office 16.2
lifestyle 15.9
executive 15.1
meeting 15.1
indoors 14.9
portrait 14.9
professional 14.7
groom 14.5
team 14.3
room 13.8
two 13.5
love 13.4
together 13.1
stringed instrument 13
archive 12.6
interior 12.4
smiling 12.3
table 12.1
kin 12.1
happiness 11.7
smile 11.4
cheerful 11.4
corporate 11.2
educator 11.1
suit 10.8
friends 10.3
manager 10.2
relaxation 10
copy space 9.9
holding 9.9
job 9.7
home 9.6
mature 9.3
inside 9.2
leisure 9.1
indoor 9.1
silhouette 9.1
businesswoman 9.1
old 9.1
fun 9
handsome 8.9
family 8.9
night 8.9
work 8.6
day 8.6
performer 8.6
wall 8.5
togetherness 8.5
desk 8.5
casual 8.5
senior 8.4
modern 8.4
student 8.3
worker 8.3
success 8
romantic 8
device 7.8
education 7.8
husband 7.8
full length 7.8
child 7.7
youth 7.7
businesspeople 7.6
adults 7.6
communication 7.6
dancer 7.5
teamwork 7.4
positive 7.4
aged 7.2
grandfather 7.2
looking 7.2
life 7.1
secretary 7.1

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

floor 94.8
indoor 87.5
ballet 87.5
wedding 7.5
person 4.1

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Male, 53.3%
Disgusted 45.3%
Calm 45.7%
Sad 47.5%
Confused 45.2%
Happy 49.8%
Surprised 45.4%
Angry 46.1%

AWS Rekognition

Age 26-43
Gender Male, 53%
Happy 45.2%
Disgusted 45.3%
Calm 46.9%
Sad 51.8%
Surprised 45.1%
Confused 45.2%
Angry 45.6%

AWS Rekognition

Age 35-52
Gender Male, 51.3%
Calm 52.3%
Happy 45.2%
Confused 45.1%
Disgusted 45.2%
Sad 46.8%
Surprised 45.2%
Angry 45.2%

AWS Rekognition

Age 38-57
Gender Male, 53.6%
Calm 50.7%
Happy 45.5%
Sad 47.2%
Angry 45.8%
Disgusted 45.2%
Confused 45.3%
Surprised 45.3%

AWS Rekognition

Age 29-45
Gender Male, 54.7%
Angry 45.2%
Sad 53.1%
Surprised 45.1%
Confused 45.2%
Calm 46.1%
Happy 45.2%
Disgusted 45.1%

AWS Rekognition

Age 26-43
Gender Male, 51.1%
Surprised 45.4%
Calm 47.6%
Disgusted 45.3%
Happy 45.6%
Angry 45.5%
Sad 50.4%
Confused 45.2%

AWS Rekognition

Age 30-47
Gender Male, 50%
Happy 46.4%
Sad 46.6%
Surprised 45.5%
Disgusted 45.4%
Angry 45.6%
Calm 50.2%
Confused 45.4%
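Each AWS Rekognition face block above reports one confidence score per emotion, and the scores for a single face need not sum to 100. When a single label per face is wanted, downstream code typically keeps only the highest-scoring emotion. A minimal sketch of that step, using scores transcribed from the last face block above (`dominant_emotion` is an illustrative helper, not a Rekognition API call):

```python
# Emotion scores for one detected face, transcribed from the record above.
face = {
    "Happy": 46.4,
    "Sad": 46.6,
    "Surprised": 45.5,
    "Disgusted": 45.4,
    "Angry": 45.6,
    "Calm": 50.2,
    "Confused": 45.4,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face))  # ('Calm', 50.2)
```

Note how close the scores are across all seven faces in this record (mostly 45-53%): the dominant emotion is often only marginally ahead of the runner-up, so the single-label summary should be read with that uncertainty in mind.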

Feature analysis

Amazon

Person 98.8%
Suit 67.3%

Captions

Microsoft

a group of people standing in a room 93.8%
a group of people in a room 93.7%
a group of people that are standing in a room 90.5%

Text analysis

Amazon

-XAOX
YT33A2
-XAOX ISE YT33A2
ISE