Human Generated Data

Title

Untitled (men from the navy helping woman onto boat)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7114

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 100
Sailor Suit 100
Person 99.8
Person 99.7
Person 99.6
Person 99.6
Person 99.4
Person 99.3
Apparel 98.9
Clothing 98.9
Shorts 98.8
Military 98.4
Person 97.5
Hat 94.4
Shoe 92.6
Footwear 92.6
Military Uniform 90.2
Officer 90.2
Shoe 88.4
Navy 87.1
Crowd 80.9
Shoe 75.2
Face 62.4
Photography 62.4
Photo 62.4
Portrait 62.4
Parade 60.4
Soldier 60
People 58.5
Shoe 56.3
Cap 55.6
Person 44.1

Imagga
created on 2021-12-15

ballplayer 65.9
player 58.2
athlete 57.3
uniform 56.5
person 48.9
contestant 40
man 38.9
military uniform 37.7
male 36.1
people 26.8
clothing 26.2
professional 23.7
adult 20.1
sport 18.8
men 18
occupation 17.4
standing 17.4
handsome 16.9
portrait 16.8
job 16.8
happy 15.7
consumer goods 15.3
covering 15.1
doctor 15
outdoors 14.9
helmet 14.8
hat 14.7
smile 14.2
worker 14.2
medical 14.1
nurse 13.7
planner 13.7
equipment 13.4
smiling 13
looking 12
safety 12
work 11.8
team 11.6
profession 11.5
black 11.4
hospital 11.3
active 11
holding 10.7
military 10.6
senior 10.3
teamwork 10.2
health 9.7
working 9.7
business 9.7
one 9.7
emergency 9.6
stethoscope 9.6
day 9.4
two 9.3
mature 9.3
fun 9
healthy 8.8
medicine 8.8
lifestyle 8.7
war 8.7
coat 8.6
student 8.3
leisure 8.3
park 8.2
care 8.2
confident 8.2
aviator 8.1
activity 8.1
game 8
to 8
pilot 7.8
boy 7.8
soldier 7.8
commodity 7.6
human 7.5
fire 7.5
protection 7.3
danger 7.3
industrial 7.3
travel 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.2
person 99.2
outdoor 98.7
clothing 95.2
ship 91
man 89.7
player 77.1
black and white 59.5

Face analysis

Amazon

Google

AWS Rekognition

Age 37-55
Gender Male, 92.6%
Surprised 34.3%
Fear 31.8%
Calm 12.9%
Confused 10.7%
Angry 5.1%
Sad 3.7%
Happy 0.8%
Disgusted 0.6%

AWS Rekognition

Age 32-48
Gender Female, 57.4%
Calm 85.1%
Confused 4.4%
Surprised 4.1%
Sad 2.3%
Angry 1.7%
Happy 1%
Disgusted 0.7%
Fear 0.6%

AWS Rekognition

Age 39-57
Gender Male, 97.1%
Calm 97.4%
Sad 1%
Confused 0.8%
Surprised 0.4%
Angry 0.2%
Happy 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 27-43
Gender Female, 53.6%
Sad 61.3%
Calm 30.7%
Fear 2.5%
Angry 2.1%
Happy 1.2%
Confused 1%
Surprised 0.9%
Disgusted 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Hat 94.4%
Shoe 92.6%

Captions

Microsoft

a group of people posing for the camera 76.1%
a group of people posing for a photo 68.3%
a group of people standing in front of a crowd 68.2%

Text analysis

Amazon

6
19615.
19513.

Google

19613. 19615,
19613.
19615,