Human Generated Data

Title

Untitled (group of women raising arms in exercise)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19725

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.2
Human 99.2
Person 99
Flag 98.4
Symbol 98.4
Apparel 96.7
Shorts 96.7
Clothing 96.7
Chair 96.5
Furniture 96.5
Crowd 93.4
Person 88.7
Female 84.4
Audience 77.2
People 72.1
Parade 71
Girl 67.5
American Flag 67.4
Child 65.2
Kid 65.2
Face 65.1
Woman 61.8
Table 61.4
Photo 61.4
Photography 61.4
Portrait 61.4
Text 61

Imagga
created on 2022-03-05

seller 25
shop 19.5
man 16.8
people 13.9
hanger 13.1
shoe shop 12.4
male 12.1
coat hanger 11.4
vehicle 11.1
clothing 10.6
mercantile establishment 10.4
person 10.4
car 10.2
support 9.9
men 9.4
holiday 9.3
work 9.1
equipment 8.9
steel 8.8
lifestyle 8.7
travel 8.4
adult 8.4
shovel 8.4
old 8.4
transport 8.2
tool 8.2
metal 8
boat 7.7
automobile 7.7
stage 7.6
fashion 7.5
happy 7.5
wood 7.5
industrial 7.3
smile 7.1
interior 7.1
place of business 7

Google
created on 2022-03-05

Photograph 94.2
White 92.2
Shorts 92.1
Black 89.8
Black-and-white 86.7
Style 84.1
Hat 81.5
Monochrome photography 76
Monochrome 75.3
Snapshot 74.3
Event 74
T-shirt 73.2
Stock photography 64.9
Room 61.2
Entertainment 59.3
Public event 56.3
Crowd 55.3
Fish 54.3
Photography 51.7
Crew 51.2

Microsoft
created on 2022-03-05

person 93.7
text 92.2
clothing 83
black and white 80.2
dance 76.8

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 99.8%
Fear 80.9%
Sad 12.2%
Calm 2.1%
Happy 1.8%
Disgusted 0.9%
Surprised 0.8%
Confused 0.7%
Angry 0.6%

AWS Rekognition

Age 35-43
Gender Male, 97.1%
Happy 45.7%
Calm 32.7%
Fear 17.3%
Surprised 1.4%
Sad 1.2%
Angry 0.6%
Confused 0.5%
Disgusted 0.5%

AWS Rekognition

Age 23-31
Gender Female, 75.5%
Happy 99.1%
Calm 0.6%
Surprised 0.1%
Fear 0.1%
Confused 0.1%
Sad 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 21-29
Gender Male, 91.9%
Fear 38.3%
Happy 34.1%
Calm 17.6%
Surprised 4.8%
Confused 2.2%
Sad 1.4%
Disgusted 1%
Angry 0.7%

AWS Rekognition

Age 22-30
Gender Male, 99.7%
Fear 34.3%
Calm 23%
Sad 10.7%
Happy 9.9%
Surprised 9.1%
Confused 8.1%
Angry 3.4%
Disgusted 1.4%

AWS Rekognition

Age 26-36
Gender Male, 71.8%
Calm 98.6%
Sad 0.8%
Happy 0.1%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 26-36
Gender Female, 79.8%
Calm 98.1%
Sad 1.3%
Confused 0.4%
Surprised 0.1%
Angry 0.1%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 16-24
Gender Male, 99.2%
Calm 98.9%
Happy 0.3%
Angry 0.2%
Sad 0.2%
Fear 0.2%
Confused 0.1%
Surprised 0.1%
Disgusted 0%

AWS Rekognition

Age 21-29
Gender Male, 99.6%
Fear 53.9%
Happy 22%
Angry 9.3%
Sad 6.1%
Calm 5.8%
Surprised 1.1%
Confused 1%
Disgusted 0.8%

AWS Rekognition

Age 23-31
Gender Female, 56.8%
Calm 62%
Fear 30.3%
Happy 2.7%
Surprised 1.1%
Disgusted 1.1%
Confused 1%
Angry 1%
Sad 0.8%

AWS Rekognition

Age 22-30
Gender Female, 83.9%
Sad 41%
Calm 39.6%
Fear 11.3%
Surprised 3.9%
Happy 2.6%
Confused 0.7%
Angry 0.5%
Disgusted 0.4%

AWS Rekognition

Age 35-43
Gender Male, 91%
Calm 79.3%
Happy 13.9%
Sad 4.2%
Angry 0.9%
Disgusted 0.5%
Surprised 0.5%
Fear 0.4%
Confused 0.3%

AWS Rekognition

Age 23-33
Gender Male, 55.2%
Happy 82%
Calm 12.9%
Confused 2.6%
Disgusted 0.6%
Angry 0.6%
Sad 0.6%
Fear 0.4%
Surprised 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a person standing in front of a boat 37.8%

Text analysis

Amazon

MERICA
18135.
B
NAMT207
N

Google

18135-
18135m
18135- 18135m