Human Generated Data

Title

Untitled (bride seated in living room with wedding guests)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11661

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.2
Human 99.2
Person 98.7
Person 98.4
Person 98.3
Apparel 97.9
Clothing 97.9
Sitting 80.2
Person 76.7
People 69.6
Text 69.1
Furniture 64
Photography 63.9
Photo 63.9
Face 62.7
Portrait 62.7
Chair 58.5
Hat 58.4
Advertisement 57.6
Crowd 56.8
Musician 56.5
Musical Instrument 56.5
Person 50.1
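
The Amazon tags above are label-detection output of the kind Amazon Rekognition returns, one label name per line with a confidence score. A minimal sketch of regenerating such a list with boto3, assuming AWS credentials are configured; the bucket and key names are hypothetical:

```python
# Minimal sketch: regenerate Rekognition-style labels for an image.
# The S3 bucket and object key below are placeholders, not the archive's real paths.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-photo-archive", "Name": "steinmetz/11661.jpg"}},
    MinConfidence=50,  # the list above bottoms out near 50%
)

for label in response["Labels"]:
    # Prints rows in the same "Name confidence" form as above, e.g. "Person 99.2"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```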

Imagga
created on 2022-01-15

wind instrument 49.3
brass 47.6
cornet 35.3
person 34.9
man 34.3
musical instrument 34
male 34
people 34
businessman 29.1
room 28
business 27.3
professional 27.1
oboe 26.2
adult 25.7
men 24.9
office 24.1
group 23.4
teacher 19.1
corporate 18.9
meeting 18.8
happy 17.5
couple 17.4
team 17
woodwind 16.9
home 15.9
student 15.8
indoors 15.8
businesswoman 15.4
desk 15.1
job 15
women 15
executive 14.9
table 14.7
indoor 14.6
worker 14.3
businesspeople 14.2
work 14.1
together 14
classroom 13.8
smiling 13.7
computer 13.6
communication 13.4
teamwork 13
manager 12.1
success 12.1
sitting 12
modern 11.9
suit 11.7
interior 11.5
senior 11.2
mature 11.2
happiness 11
laptop 10.9
handsome 10.7
colleagues 10.7
family 10.7
talking 10.5
education 10.4
lifestyle 10.1
smile 10
board 9.9
chair 9.8
conference 9.8
working 9.7
new 9.7
medical 9.7
educator 9.4
study 9.3
finance 9.3
employee 8.7
casual 8.5
life 8.5
successful 8.2
cheerful 8.1
flute 8
diverse 7.8
hands 7.8
class 7.7
attractive 7.7
diversity 7.7
hand 7.6
enrollee 7.5
holding 7.4
technology 7.4
occupation 7.3
nurse 7
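
Imagga's tags come from its v2 tagging endpoint, a plain REST API queried with HTTP basic auth. A minimal sketch using the requests library; the credentials and image URL are placeholders, and the response shape is an assumption based on Imagga's documented v2 format:

```python
# Minimal sketch: query Imagga's v2 /tags endpoint for an image URL.
# API key/secret and the image URL are placeholders.
import requests

auth = ("YOUR_API_KEY", "YOUR_API_SECRET")
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/untitled-bride-seated.jpg"},
    auth=auth,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # e.g. "wind instrument 49.3", matching the list above
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```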

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.6
person 95.3
clothing 94.6
man 68.5
sport 67.2

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 99.5%
Calm 46.9%
Happy 27.8%
Sad 14.6%
Angry 3.7%
Confused 2.6%
Surprised 2.3%
Disgusted 1.4%
Fear 0.8%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Calm 93.8%
Surprised 3.6%
Confused 0.8%
Sad 0.7%
Angry 0.3%
Disgusted 0.3%
Happy 0.3%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Male, 86.6%
Calm 95.1%
Surprised 1.9%
Sad 1.5%
Happy 0.4%
Angry 0.4%
Confused 0.4%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 39-47
Gender Female, 76.5%
Calm 90.5%
Surprised 3.7%
Fear 2.2%
Happy 1.4%
Sad 1.1%
Confused 0.5%
Disgusted 0.4%
Angry 0.2%

AWS Rekognition

Age 45-51
Gender Male, 96.9%
Calm 98%
Sad 1.1%
Fear 0.3%
Surprised 0.2%
Confused 0.2%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
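
Each AWS Rekognition block above (an age range, a gender estimate, and eight emotion scores) corresponds to one entry in the FaceDetails list returned by a detect_faces call. A minimal sketch, again with a hypothetical bucket and key:

```python
# Minimal sketch: per-face age/gender/emotion estimates like those above.
# Attributes=["ALL"] requests the full attribute set, including emotions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-photo-archive", "Name": "steinmetz/11661.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort descending to match the listing above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```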

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
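
The Google Vision blocks report per-face likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch of producing them with the google-cloud-vision client, assuming application credentials are configured; the filename is a placeholder:

```python
# Minimal sketch: Google Vision face annotations with likelihood buckets.
# The local filename is a placeholder for the digitized photograph.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled-bride-seated.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Enum names like VERY_UNLIKELY map to the buckets shown above
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```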

Feature analysis

Amazon

Person 99.2%
Chair 58.5%

Captions

Microsoft

a group of people in a room 91.1%
a group of people sitting in chairs 81.6%
a group of people sitting in a chair 78.2%
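
The Microsoft tags and captions come from Azure's Computer Vision service; its describe operation returns ranked caption candidates with confidences like the three above. A minimal sketch with placeholder endpoint, key, and image URL:

```python
# Minimal sketch: Azure Computer Vision caption candidates.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

analysis = client.describe_image(
    "https://example.org/untitled-bride-seated.jpg", max_candidates=3
)

for caption in analysis.captions:
    # e.g. "a group of people in a room 91.1%"
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```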

Text analysis

Amazon

9350.
YT37AS
A70A
M.IU YT37AS A70A
M.IU

Google

9350.
935
9350
935 9350. 9350
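
Both text-analysis lists are OCR output: the fragments are whatever the engines read off the photograph itself, not document errors. A minimal sketch of the Amazon side using Rekognition's detect_text, with a hypothetical bucket and key:

```python
# Minimal sketch: OCR lines like "9350." come from a text-detection call.
# The S3 bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "my-photo-archive", "Name": "steinmetz/11661.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates of each line
        print(detection["DetectedText"])
```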