Human Generated Data

Title

Untitled (group portrait of men and women with teacups gathered around woman pouring tea)

Date

1950

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10835

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.4
Human 99.4
Person 99.4
Person 99.4
Person 99.1
Person 97.3
Person 96.2
People 96.2
Person 93.3
Person 93
Person 92.7
Person 89.1
Person 89
Tie 82.5
Accessories 82.5
Accessory 82.5
Person 82.4
Person 74.3
Face 70.4
Photography 64
Photo 64
Crowd 62.7
Female 61.4
Apparel 60.4
Clothing 60.4
Portrait 60.4
Family 58.9
Table 58.7
Furniture 58.7
Girl 58.3

Imagga
created on 2022-01-29

nurse 68.8
man 34.3
male 32.7
adult 30.8
people 30.7
person 30
home 25.5
smiling 24.6
happy 23.8
happiness 21.9
teacher 21.9
room 21.9
men 21.5
couple 20.9
child 20.5
cheerful 20.3
women 19.8
businessman 19.4
indoors 18.4
business 18.2
family 17.8
kin 17.7
together 17.5
lifestyle 17.3
professional 16.9
businesswoman 16.4
life 15.6
30s 15.4
love 15
mother 15
group 14.5
blackboard 14.4
smile 14.2
school 14
boy 13.9
education 13.8
office 13.6
portrait 13.6
classroom 13.4
meeting 13.2
patient 13.1
mature 13
20s 12.8
casual 12.7
two 12.7
colleagues 12.6
husband 12.6
team 12.5
holding 12.4
businesspeople 11.4
sitting 11.2
house 10.9
attractive 10.5
talking 10.4
wife 10.4
togetherness 10.4
senior 10.3
work 10.2
clothing 10.2
camera 10.2
student 10.1
kid 9.7
40s 9.7
to 9.7
fun 9.7
working 9.7
success 9.7
daughter 9.3
educator 9.3
father 9.2
care 9
hospital 9
color 8.9
job 8.8
interior 8.8
medical 8.8
looking 8.8
couch 8.7
mid adult 8.7
chair 8.6
elderly 8.6
reading 8.6
friends 8.5
executive 8.3
relaxing 8.2
brunette 7.8
face 7.8
table 7.8
health 7.6
females 7.6
laughing 7.6
parent 7.5
teamwork 7.4
inside 7.4
children 7.3
new 7.3
board 7.2
worker 7.2
handsome 7.1
modern 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

birthday cake 97
text 95.1
person 91.2
clothing 80.9
wedding cake 76.5
table 75.6
woman 69.3
cake 68
food 65.4
white 65.4
man 56.7
old 42.8

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 84.4%
Happy 65.8%
Calm 11.7%
Sad 9.9%
Surprised 4%
Angry 3.9%
Disgusted 2.3%
Confused 1.3%
Fear 1.2%

AWS Rekognition

Age 38-46
Gender Male, 99.3%
Sad 39.4%
Calm 25.7%
Confused 19.9%
Happy 10%
Disgusted 1.8%
Surprised 1.2%
Angry 1.1%
Fear 1%

AWS Rekognition

Age 31-41
Gender Male, 87.1%
Calm 80.9%
Sad 14.6%
Happy 2.4%
Angry 1.1%
Surprised 0.4%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 26-36
Gender Male, 98.5%
Happy 85.3%
Calm 12.8%
Surprised 0.6%
Sad 0.5%
Disgusted 0.4%
Confused 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 50-58
Gender Male, 99.3%
Calm 63.1%
Sad 20.3%
Happy 6%
Surprised 4.1%
Confused 2%
Disgusted 2%
Fear 1.4%
Angry 1.2%

AWS Rekognition

Age 37-45
Gender Male, 98.9%
Calm 96%
Sad 1.9%
Happy 0.6%
Confused 0.5%
Surprised 0.3%
Disgusted 0.3%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 51-59
Gender Male, 72.3%
Calm 91.4%
Happy 5.5%
Sad 1.2%
Confused 0.8%
Disgusted 0.4%
Angry 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 29-39
Gender Female, 75.9%
Calm 45.2%
Sad 31.5%
Happy 16.4%
Confused 3.8%
Surprised 1.1%
Fear 0.7%
Angry 0.6%
Disgusted 0.6%

AWS Rekognition

Age 34-42
Gender Male, 99.5%
Sad 56%
Calm 41.9%
Confused 1%
Angry 0.4%
Happy 0.3%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Tie 82.5%

Captions

Microsoft

a group of people posing for a photo 88.6%
a group of people posing for the camera 88.5%
a group of people posing for a picture 88.4%

Text analysis

Amazon

81A
KODAK-

Google

YT37A2-YAGO
YT37A2-YAGO