Human-Generated Data

Title

Untitled (wedding guests at a table under a tent)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8527

Copyright

© Estate of Joseph Janney Steinmetz

Machine-Generated Data

Tags (confidence, %)

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99
Person 98.8
Person 98.4
Hat 97.8
Clothing 97.8
Apparel 97.8
Person 97.3
Person 97.2
Person 96.9
Suit 80.1
Overcoat 80.1
Coat 80.1
People 73.4
Person 72
Person 70.5
Face 59.9
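
The label/score pairs above follow the shape of Amazon Rekognition's DetectLabels output: each label comes with a 0-100 confidence. A minimal sketch of how such tags are produced, assuming configured AWS credentials and a hypothetical local filename:

    import boto3

    # Label detection with Amazon Rekognition (boto3).
    # AWS credentials are assumed configured; the filename is hypothetical.
    client = boto3.client("rekognition")

    with open("steinmetz_8527.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    # Each label carries a name and a 0-100 confidence,
    # matching entries such as "Person 99.7" above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')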

Imagga
created on 2022-01-09

man 30.2
people 24
radio telescope 21.5
kin 20.9
male 20.5
astronomical telescope 17.2
adult 17
person 15.4
professional 13
telescope 12.8
couple 12.2
black 12
work 11.9
spectator 11.8
portrait 11.6
world 11.4
happy 10.6
magnifier 10.5
business 10.3
men 10.3
family 9.8
smiling 9.4
life 9.2
worker 9.1
old 9
medical 8.8
businessman 8.8
together 8.8
surgeon 8.5
casual 8.5
hat 8.5
doctor 8.5
senior 8.4
team 8.1
love 7.9
hospital 7.7
health 7.6
nurse 7.6
outdoors 7.5
occupation 7.3
celebration 7.2
patient 7.2
smile 7.1
job 7.1
happiness 7
medicine 7
sky 7
modern 7
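
Imagga tags are typically retrieved over its REST API rather than an SDK. A minimal sketch, assuming the v2 /tags endpoint with placeholder credentials and image URL; the response shape follows Imagga's documented format, but verify against the current docs:

    import requests

    # Tagging via the Imagga REST API (key/secret and URL are placeholders).
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/image.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )

    # Tags arrive with 0-100 confidences, as in "man 30.2" above.
    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')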

Google
created on 2022-01-09

Black 89.8
Black-and-white 87.9
Hat 85.4
Style 84.1
Font 80.6
Monochrome 80.2
Adaptation 79.3
Monochrome photography 79.1
Snapshot 74.3
Art 74.1
Event 72.2
Crew 72.2
T-shirt 72.1
Design 68.3
Cooking 66.9
Room 66.1
Stock photography 65.5
Team 64.8
Illustration 63.9
Crowd 63.5
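
The Google tags correspond to Vision API label detection; the API returns 0-1 scores, shown above scaled to 0-100. A minimal sketch, assuming application-default credentials and a placeholder image URI:

    from google.cloud import vision

    # Label detection with the Google Cloud Vision API.
    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = "https://example.org/image.jpg"  # placeholder

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        # Scores are 0-1 floats; scale to match the 0-100 list above.
        print(f"{label.description} {label.score * 100:.1f}")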

Microsoft
created on 2022-01-09

person 99.6
clothing 96.3
text 95.9
man 91.3
black and white 85.4
standing 80.4
human face 65.8
group 59.2
posing 54
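
The Microsoft tags match Azure Computer Vision's image-tagging operation. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision SDK with placeholder endpoint, key, and image URL:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Image tagging with Azure Computer Vision (endpoint/key are placeholders).
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    result = client.tag_image("https://example.org/image.jpg")
    for tag in result.tags:
        # Confidences are 0-1 floats; scale to match the list above.
        print(f"{tag.name} {tag.confidence * 100:.1f}")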

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 95.1%
Calm 99.4%
Happy 0.2%
Sad 0.1%
Surprised 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 45-51
Gender Male, 98.8%
Calm 39.2%
Surprised 37.4%
Happy 15.5%
Disgusted 2.9%
Sad 2.1%
Confused 1.7%
Angry 0.8%
Fear 0.4%

AWS Rekognition

Age 37-45
Gender Male, 70.9%
Calm 99.8%
Angry 0.1%
Sad 0%
Surprised 0%
Confused 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Female, 78.2%
Calm 83.9%
Happy 12%
Sad 2.1%
Confused 0.7%
Fear 0.4%
Disgusted 0.4%
Surprised 0.3%
Angry 0.2%

AWS Rekognition

Age 40-48
Gender Male, 90.4%
Sad 62.4%
Calm 28.2%
Surprised 2.9%
Angry 2%
Confused 1.5%
Fear 1.4%
Disgusted 1%
Happy 0.6%

AWS Rekognition

Age 24-34
Gender Female, 82.8%
Calm 63.5%
Surprised 26.5%
Angry 3.3%
Happy 2.6%
Fear 1.6%
Disgusted 1.4%
Sad 0.7%
Confused 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
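
The Google Vision blocks come from face detection, which reports each attribute as a likelihood enum (VERY_UNLIKELY through VERY_LIKELY) rather than a score, hence "Very unlikely" above. A minimal sketch, assuming application-default credentials and a placeholder URI:

    from google.cloud import vision

    # Face detection with the Google Cloud Vision API.
    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = "https://example.org/image.jpg"  # placeholder

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihoods are enums, e.g. VERY_UNLIKELY, matching the rows above.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)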

Feature analysis

Amazon

Person 99.7%
Hat 97.8%

Captions

Microsoft

a group of people posing for a photo 96.3%
a group of people posing for the camera 96.2%
a group of people posing for a picture 96.1%
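
The three near-identical captions are ranked candidates from Azure Computer Vision's describe operation. A minimal sketch, assuming the same SDK as above with placeholder endpoint, key, and URL:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Caption candidates with Azure Computer Vision's describe operation.
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    analysis = client.describe_image(
        "https://example.org/image.jpg",  # placeholder
        max_candidates=3,
    )
    for caption in analysis.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}")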

Text analysis

Amazon

17532.

Google

.-2 5רו בב 5 רןונ
.-2
5רו
בב
5
רןונ
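
The Amazon entry is a text line detected by Rekognition's DetectText; Google's OCR likewise returns the full detected string followed by its individual word tokens, which is why the line above repeats as five fragments. A minimal sketch of the Amazon side, assuming configured AWS credentials and a hypothetical filename:

    import boto3

    # OCR with Rekognition DetectText; LINE entries correspond to "17532." above.
    client = boto3.client("rekognition")
    with open("steinmetz_8527.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_text(Image={"Bytes": f.read()})

    for det in response["TextDetections"]:
        if det["Type"] == "LINE":
            print(det["DetectedText"], f'({det["Confidence"]:.1f}%)')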