Human Generated Data

Title

Untitled (woman in furs at hunt, Maryland Hunt Cup Race, Maryland)

Date

1939, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.311

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.1
Person 99.1
Clothing 98.7
Apparel 98.7
Person 98.6
Person 98.1
Person 97.1
Coat 96.7
Person 96.7
Person 96.6
Person 96.5
Person 92.4
Person 91.7
Person 88.3
Person 87.3
Euphonium 86.3
Musical Instrument 86.3
Tuba 86.3
Horn 86.3
Brass Section 86.3
Hat 77.9
Musician 69.8
Music Band 69.8
Overcoat 67.4
Crowd 66.2
People 65.9
Footwear 59.8
Shoe 59.8
Face 59

Imagga
created on 2022-01-08

uniform 100
military uniform 100
clothing 96.7
consumer goods 67.3
covering 65.9
commodity 33.6
man 32.2
male 26.2
military 24.1
people 22.3
person 21.7
soldier 19.5
weapon 19.4
war 19.3
army 18.5
gun 17
helmet 15.6
outdoor 15.3
adult 14.2
battle 13.7
history 13.4
men 12.9
warrior 12.7
horse 12.3
rifle 12.2
outdoors 11.9
danger 11.8
combat 11.8
fight 10.6
old 10.4
hat 10.2
private 10.2
sport 9.9
portrait 9.7
together 9.6
travel 9.1
protection 9.1
family 8.9
happy 8.8
boy 8.7
animal 8.6
two 8.5
summer 8.4
sky 8.3
to 8
day 7.8
smile 7.8
conflict 7.8
outside 7.7
industry 7.7
statue 7.6
walking 7.6
child 7.5
fire 7.5
industrial 7.3
active 7.2
building 7.1
job 7.1
work 7.1

Google
created on 2022-01-08

Clothing 98.7
Footwear 98.1
Trousers 96.2
Outerwear 95.1
Photograph 94.2
Hat 93.7
Coat 92.8
Sun hat 87.5
Plant 81.8
Suit 81.7
Headgear 81.1
Musical instrument 81
Tree 77.8
Military person 75.2
Event 73.4
Vintage clothing 73.2
Monochrome 72.7
Fedora 71.9
Crew 69.9
Monochrome photography 69.8

Microsoft
created on 2022-01-08

person 100
outdoor 99.9
sky 99.7
grass 99.6
clothing 97.7
standing 92.3
man 91.4
text 87.7
group 77.5
black and white 73.6
people 73.4
white 68.9
old 44.9
crowd 2.1

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 99.3%
Happy 99.9%
Calm 0%
Angry 0%
Surprised 0%
Sad 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Female, 100%
Calm 98.2%
Sad 1.2%
Happy 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 31-41
Gender Female, 99.9%
Calm 99.3%
Happy 0.3%
Surprised 0.1%
Angry 0%
Sad 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 86.7%
Calm 95.5%
Angry 2.2%
Surprised 0.8%
Confused 0.6%
Sad 0.5%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Sad 97.5%
Calm 1.9%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Happy 0%
Surprised 0%

AWS Rekognition

Age 23-33
Gender Female, 99.3%
Happy 47.7%
Calm 29.3%
Disgusted 9.8%
Confused 4.8%
Angry 2.5%
Surprised 2.4%
Sad 1.8%
Fear 1.7%

AWS Rekognition

Age 48-56
Gender Female, 100%
Sad 53.9%
Surprised 8.3%
Calm 8%
Happy 8%
Confused 6.6%
Angry 6.2%
Disgusted 5.9%
Fear 3.1%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Calm 51.6%
Happy 20.6%
Sad 7.8%
Fear 6.3%
Angry 5.3%
Surprised 5.1%
Disgusted 1.8%
Confused 1.5%

AWS Rekognition

Age 23-33
Gender Male, 99.6%
Fear 69.8%
Sad 9.6%
Surprised 5.1%
Happy 4.5%
Calm 4.5%
Disgusted 3.6%
Angry 1.9%
Confused 0.9%

AWS Rekognition

Age 19-27
Gender Female, 72.8%
Sad 99.9%
Calm 0%
Fear 0%
Angry 0%
Disgusted 0%
Confused 0%
Happy 0%
Surprised 0%

Microsoft Cognitive Services

Age 43
Gender Female

Microsoft Cognitive Services

Age 54
Gender Female

Microsoft Cognitive Services

Age 55
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.1%
Coat 96.7%
Hat 77.9%
Shoe 59.8%

Captions

Microsoft

a group of people standing in a field 98.2%
a group of people that are standing in the grass 97.2%
a group of people standing in the grass 97.1%