Human Generated Data

Title

Untitled (group portrait of men and women with teacups gathered around woman pouring tea)

Date

1950

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10836

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 99.7
Person 99.7
Person 98.8
Person 98.7
Person 98.6
Person 98.5
Person 98
Person 97.2
People 96.8
Person 96.7
Person 95.3
Person 91
Person 82
Family 78.6
Accessories 74.2
Tie 74.2
Accessory 74.2
Crowd 60.5
Clothing 57.1
Apparel 57.1
Jury 55.2

Imagga
created on 2022-01-29

teacher 43.4
classroom 42.2
room 40.6
man 40.3
person 37.9
male 36.9
people 32.9
business 32.2
adult 31.5
businessman 30.9
professional 30.8
meeting 29.2
group 29
table 28.6
office 27.3
smiling 26
men 24.9
businesswoman 24.5
team 24.2
sitting 24.1
women 23.7
marimba 23.5
educator 23
happy 21.9
percussion instrument 20.2
blackboard 20
talking 20
student 19.7
teamwork 19.5
colleagues 19.4
indoors 19.3
musical instrument 19.1
businesspeople 19
work 18.8
communication 18.5
corporate 18
education 17.3
cheerful 17.1
desk 17
executive 17
together 16.6
lifestyle 16.6
couple 16.6
nurse 15.9
laptop 15.5
worker 15.2
senior 15
school 14.3
mature 14
successful 13.7
portrait 13.6
home 13.6
working 13.3
manager 13
confident 12.7
happiness 12.5
job 12.4
kin 12.3
teaching 11.7
smile 11.4
presentation 11.2
discussing 10.8
suit 10.8
conference 10.8
discussion 10.7
class 10.6
four 10.5
modern 10.5
workplace 10.5
standing 10.4
chair 10.4
looking 10.4
indoor 10
holding 9.9
diverse 9.8
content 9.7
technology 9.6
30s 9.6
elderly 9.6
adults 9.5
color 9.5
togetherness 9.4
learning 9.4
company 9.3
board 9
new 8.9
success 8.9
interior 8.8
coworkers 8.8
computer 8.8
casual clothing 8.8
employee 8.8
two people 8.7
leader 8.7
drinking 8.6
boss 8.6
friends 8.5
friendship 8.4
ideas 8.4
study 8.4
camera 8.3
life 7.9
explaining 7.9
20-24 years 7.9
associates 7.9
30-35 years 7.9
leisure activity 7.8
boy 7.8
40s 7.8
partners 7.8
grandfather 7.7
diversity 7.7
two 7.6
gesture 7.6
ethnic 7.6
enjoyment 7.5
coffee 7.4
training 7.4
20s 7.3
kid 7.1
to 7.1
day 7.1
child 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

birthday cake 94.3
text 93.6
person 91.3
music 89.8
wedding cake 81.6
candle 60.8
table 60.4
clothing 58.5
old 57.2
cake 56.9
wedding dress 51.7
posing 40.5
clothes 15

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Male, 99.8%
Happy 73.9%
Calm 19.4%
Sad 2.6%
Surprised 1.6%
Angry 1.1%
Disgusted 0.7%
Confused 0.5%
Fear 0.3%

AWS Rekognition

Age 50-58
Gender Male, 99.9%
Calm 86.2%
Sad 6.5%
Happy 2.5%
Surprised 2%
Confused 1%
Angry 0.7%
Disgusted 0.6%
Fear 0.5%

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Calm 90.4%
Happy 7.1%
Surprised 1.7%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 98.3%
Surprised 41.8%
Happy 26.1%
Calm 25.3%
Fear 2%
Sad 1.9%
Disgusted 1.1%
Angry 1%
Confused 0.6%

AWS Rekognition

Age 21-29
Gender Male, 63.6%
Surprised 38.3%
Calm 30.7%
Happy 20.9%
Sad 6%
Disgusted 1.3%
Fear 1.2%
Angry 1.1%
Confused 0.7%

AWS Rekognition

Age 29-39
Gender Male, 98.6%
Calm 99.4%
Confused 0.1%
Sad 0.1%
Surprised 0.1%
Angry 0.1%
Happy 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 99.8%
Calm 61.7%
Confused 11.3%
Fear 6.6%
Surprised 6.2%
Sad 6.1%
Angry 3.3%
Disgusted 3.1%
Happy 1.6%

AWS Rekognition

Age 40-48
Gender Male, 99.8%
Sad 78%
Calm 13.5%
Happy 3.1%
Confused 3%
Angry 0.8%
Disgusted 0.7%
Surprised 0.6%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Tie 74.2%

Captions

Microsoft

a group of people standing in front of a piano 96.2%
a vintage photo of a group of people standing in front of a piano 92.8%
a group of people posing for a photo in front of a piano 92.7%

Text analysis

Amazon

ODYK
oil
Y03942