Human Generated Data

Title

Untitled (group portrait of men and women with teacups gathered around woman pouring tea)

Date

1950

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10837

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.6
Human 99.6
Person 99.1
Person 98.7
Person 98.7
People 97.6
Person 97.4
Person 97.1
Person 96.8
Person 96.8
Person 96.4
Person 95.4
Family 84.9
Tie 76.3
Accessories 76.3
Accessory 76.3
Person 67.9
Photo 61.7
Photography 61.7
Person 60.9
Room 60.5
Indoors 60.5
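
Label lists like the one above come back from tagging services as (label, confidence) pairs and are usually filtered at a minimum-confidence threshold before display. A minimal sketch of that post-processing step, using values from this record (the function name is illustrative, not part of any service's API):

```python
# Post-process (label, confidence) pairs from an image-tagging service:
# keep labels at or above a minimum confidence and collapse duplicate
# label names (e.g. the repeated "Person" entries) to their best score.
def filter_labels(labels, min_confidence=90.0):
    best = {}
    for name, conf in labels:
        if conf >= min_confidence and conf > best.get(name, 0.0):
            best[name] = conf
    # Highest-confidence labels first
    return sorted(best.items(), key=lambda kv: -kv[1])

# A subset of the Amazon tags listed above
tags = [("Person", 99.6), ("Human", 99.6), ("People", 97.6),
        ("Family", 84.9), ("Tie", 76.3), ("Room", 60.5)]
print(filter_labels(tags))  # only labels scored >= 90
```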

Imagga
created on 2022-01-29

room 40.1
male 36.2
man 34.9
person 34.7
people 34.6
meeting 33.9
businessman 32.7
group 32.2
business 32.2
office 30.5
table 30.3
adult 30
classroom 30
teacher 29.4
men 29.2
professional 28.8
women 28.5
kin 27.8
team 27.8
businesswoman 27.3
sitting 25.8
smiling 25.3
together 24.5
happy 24.4
teamwork 24.1
work 23.5
executive 21.7
cheerful 20.3
couple 20
corporate 19.8
laptop 19.1
talking 19
businesspeople 19
communication 18.5
indoors 18.5
job 17.7
colleagues 17.5
worker 17
nurse 16.9
manager 16.8
presentation 16.8
portrait 16.2
smile 15.7
educator 15.7
conference 15.6
suit 15.3
desk 15.1
happiness 14.9
successful 14.6
home 13.6
boss 13.4
lifestyle 13
education 13
success 12.9
life 12.7
modern 12.6
holding 12.4
working 12.4
indoor 11.9
confident 10.9
student 10.9
casual clothing 10.8
discussion 10.7
leader 10.6
color 10.6
workplace 10.5
mature 10.2
teaching 9.7
new 9.7
class 9.6
chair 9.6
ethnic 9.5
togetherness 9.4
friends 9.4
finance 9.3
employee 9.3
board 9
school 8.9
family 8.9
interior 8.8
coworkers 8.8
computer 8.8
diverse 8.8
looking 8.8
40s 8.8
partners 8.7
love 8.7
plan 8.5
two 8.5
enjoyment 8.4
senior 8.4
friendship 8.4
coffee 8.3
training 8.3
mother 8.2
hall 8
20-24 years 7.9
seminar 7.9
have 7.8
boy 7.8
standing 7.8
face 7.8
mid adult 7.7
diversity 7.7
four 7.7
drinking 7.7
house 7.5
ideas 7.5
study 7.5
company 7.4
blackboard 7.3
20s 7.3
child 7.2
handsome 7.1

Google
created on 2022-01-29

Photograph 94.2
Black 89.7
Dress 85.8
Chair 83.5
Monochrome 76.8
Vintage clothing 73.6
Event 73.4
Classic 71.8
Monochrome photography 71
Font 69.1
Room 66.1
Stock photography 65.3
Suit 64.8
Sitting 64.4
Photo caption 62.6
History 62.3
Table 61.1
Art 60.8
Couch 56.2
Family 53.3

Microsoft
created on 2022-01-29

person 94.6
music 89.4
text 80.9
clothing 78.5
white 69.8
black 68.4
table 58.4
old 57.1

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 99.9%
Happy 82.5%
Calm 11.5%
Surprised 3.2%
Sad 1.1%
Fear 0.5%
Angry 0.5%
Confused 0.5%
Disgusted 0.3%

AWS Rekognition

Age 49-57
Gender Male, 98.8%
Calm 59.9%
Sad 13.8%
Fear 7.3%
Happy 7%
Confused 4.2%
Surprised 3.6%
Disgusted 2.4%
Angry 1.9%

AWS Rekognition

Age 49-57
Gender Male, 98.7%
Calm 51.3%
Happy 42.1%
Sad 4.7%
Surprised 0.5%
Confused 0.4%
Disgusted 0.4%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 39-47
Gender Male, 99.5%
Calm 99.2%
Happy 0.5%
Surprised 0.2%
Confused 0.1%
Sad 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 98%
Sad 0.4%
Fear 0.3%
Confused 0.3%
Surprised 0.3%
Happy 0.3%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 40-48
Gender Male, 84.9%
Calm 94.1%
Happy 2%
Sad 1.4%
Confused 1.4%
Surprised 0.3%
Angry 0.3%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 29-39
Gender Female, 78.9%
Happy 87.9%
Calm 8.8%
Surprised 1.8%
Sad 0.6%
Disgusted 0.3%
Angry 0.2%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 28-38
Gender Male, 98.9%
Calm 99.5%
Surprised 0.3%
Angry 0.1%
Confused 0%
Sad 0%
Disgusted 0%
Happy 0%
Fear 0%
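
Each Rekognition face block above reports an emotion distribution whose percentages sum to roughly 100; a per-face summary (e.g. "Happy 82.5%") is just the highest-scoring emotion. A sketch of that selection, assuming the scores are given as a name-to-percentage mapping:

```python
# Pick the dominant emotion from a per-face emotion distribution,
# as reported in the AWS Rekognition blocks above.
def dominant_emotion(scores):
    name = max(scores, key=scores.get)
    return name, scores[name]

# The first face in this record
face = {"Happy": 82.5, "Calm": 11.5, "Surprised": 3.2, "Sad": 1.1,
        "Fear": 0.5, "Angry": 0.5, "Confused": 0.5, "Disgusted": 0.3}
print(dominant_emotion(face))  # ('Happy', 82.5)
```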

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
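
The Google Vision face attributes above ("Very unlikely", "Unlikely", "Possible", ...) are the human-readable buckets of the Vision API's Likelihood enum rather than numeric confidences. A sketch of the mapping, assuming the standard enum ordering:

```python
# Google Vision reports face attributes as a Likelihood enum
# (UNKNOWN=0 through VERY_LIKELY=5); map its value to the
# display labels used in the blocks above.
LIKELIHOOD = ["Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely"]

def likelihood_name(value):
    return LIKELIHOOD[value]

print(likelihood_name(3))  # Possible
```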

Feature analysis

Amazon

Person 99.6%
Tie 76.3%

Captions

Microsoft

a vintage photo of a group of people standing in front of a piano 94.3%
a group of people standing in front of a piano 94.2%
a vintage photo of a group of people in front of a piano 94.1%