Human Generated Data

Title

Untitled (large group of women standing with arms raised)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7152

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (each tag is listed with the service's confidence score on a 0–100 scale)

Amazon
created on 2021-12-15

Human 98.2
Person 98.2
Marching 98
Crowd 98
Person 96.8
Audience 96.5
Person 91.2
Person 89.6
Person 82.8
People 73.7
Person 69.8
Parade 55.9
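
For reference, a minimal sketch of how label/confidence pairs like the list above can be requested from Amazon Rekognition with boto3; the file name, label cap, and confidence threshold are placeholders, not the settings used for this record.

```python
import boto3

# Placeholder file name and thresholds; region and credentials come from
# the standard AWS configuration chain.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Each label carries a name and a 0-100 confidence score, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```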

Clarifai
created on 2023-10-15

people 99.8
crowd 99.7
many 98.6
man 96.8
group 96.4
military 93.1
war 93
soldier 92.2
conformity 90.7
uniform 90.4
group together 90.3
music 89.8
audience 89.7
woman 88.9
adult 88.1
school 85.6
army 84.2
administration 83.6
child 83.6
education 79.9
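
A hedged sketch of a Clarifai request that returns concept/confidence pairs of this kind; the endpoint path, model name, and authorization header follow Clarifai's public v2 REST API as I understand it and should be treated as assumptions, and the key and image URL are placeholders.

```python
import requests

# Assumed endpoint, model name, and auth header for Clarifai's v2 REST API.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts come back with a 0-1 value; scale to match the 0-100 list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```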

Imagga
created on 2021-12-15

meat hook 100
hook 95.9
crowd 18.2
black 16.8
group 15.3
design 13.5
wood 12.5
silhouette 12.4
party 12
old 11.8
texture 11.1
support 10.9
body 10.4
art 10
metal 9.7
people 9.5
grunge 9.4
equipment 9.3
male 9.2
color 8.9
fence 8.8
wall 8.8
close 8.6
iron 8.4
row 8.4
pattern 8.2
landscape 8.2
detail 8
scene 7.8
travel 7.7
men 7.7
decoration 7.7
chain 7.6
outdoors 7.5
baluster 7.5
shape 7.4
clothing 7.4
vintage 7.4
graphic 7.3
business 7.3
music 7.2
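
A minimal sketch of an Imagga tagging request; the /v2/tags endpoint and HTTP Basic key/secret pair reflect Imagga's documented REST API, but the credentials and image URL here are placeholders.

```python
import requests

# Placeholder credentials; Imagga authenticates with HTTP Basic auth
# using an API key/secret pair.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    timeout=30,
)
resp.raise_for_status()

# Tags are returned with a 0-100 confidence score and localized names.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```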

Google
created on 2021-12-15

Gesture 85.3
Font 82.4
Crowd 74.9
Snapshot 74.3
Event 73.2
Monochrome 72.2
Crew 71.8
Monochrome photography 71.5
Photo caption 69.4
Suit 63.8
Team 61.7
History 60.9
Metal 58.6
Audience 56.7
Room 53.1
Art 50
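
A minimal sketch of Google Cloud Vision label detection, which yields description/score pairs like the list above; scores are 0–1 and scaled here to 0–100. The file name is a placeholder, credentials are read from the environment, and google-cloud-vision >= 2.0 is assumed.

```python
from google.cloud import vision

# Credentials come from GOOGLE_APPLICATION_CREDENTIALS.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1; scale to 0-100 to match the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```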

Microsoft
created on 2021-12-15

person 99.7
posing 99
text 98.4
group 95.5
outdoor 94.7
drawing 89.8
sketch 86.1
black 80.6
people 72
clothing 53.1
team 29.3
male 16.1
crowd 3.7
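
A sketch using the Azure Computer Vision Python SDK (azure-cognitiveservices-vision-computervision), which returns tag/confidence pairs of this kind; the endpoint, key, and image URL are placeholders, and the call surface may differ across SDK versions.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint, key, and image URL.
client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

analysis = client.analyze_image(
    "https://example.org/photo.jpg",
    visual_features=[VisualFeatureTypes.tags],
)

# Tag confidences are 0-1; scale to 0-100 to match the list above.
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```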

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 78.6%
Calm 97.1%
Sad 1.1%
Happy 0.9%
Surprised 0.7%
Confused 0.1%
Angry 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 21-33
Gender Female, 77%
Calm 81.8%
Happy 6.9%
Surprised 6.1%
Fear 1.7%
Sad 1.4%
Confused 1.2%
Angry 0.6%
Disgusted 0.2%

AWS Rekognition

Age 23-35
Gender Male, 64.3%
Happy 56.4%
Calm 38.9%
Sad 3.1%
Confused 0.5%
Surprised 0.5%
Angry 0.4%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 24-38
Gender Male, 98.8%
Calm 70.1%
Surprised 22.9%
Fear 2.8%
Angry 1.4%
Sad 1.3%
Confused 1%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 23-35
Gender Female, 60.8%
Calm 91.8%
Surprised 3.2%
Sad 2.2%
Happy 1.8%
Confused 0.9%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 28-44
Gender Male, 54.8%
Calm 85.3%
Sad 9.7%
Confused 3.5%
Angry 0.8%
Surprised 0.3%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 44-62
Gender Male, 97.7%
Calm 90.7%
Sad 5.1%
Happy 2.4%
Confused 0.7%
Disgusted 0.5%
Angry 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 24-38
Gender Female, 53.4%
Calm 97.2%
Sad 1.4%
Happy 0.7%
Surprised 0.3%
Angry 0.2%
Confused 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 34-50
Gender Male, 98.8%
Calm 93.1%
Sad 1.8%
Angry 1.6%
Happy 1.2%
Confused 1.1%
Surprised 0.8%
Disgusted 0.3%
Fear 0%

AWS Rekognition

Age 23-35
Gender Female, 75.5%
Sad 89.7%
Calm 6%
Confused 3.7%
Angry 0.2%
Fear 0.2%
Happy 0.1%
Surprised 0.1%
Disgusted 0%

AWS Rekognition

Age 36-54
Gender Female, 84.8%
Calm 64.4%
Happy 19.5%
Sad 10.1%
Fear 2.8%
Confused 1.3%
Surprised 1%
Disgusted 0.5%
Angry 0.4%
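
The per-face entries above (age range, gender with confidence, emotions ranked by confidence) match the shape of Amazon Rekognition's DetectFaces response; a minimal boto3 sketch follows, with a placeholder file name.

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unordered; sort descending by confidence as above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```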

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
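
The Google Vision entries above report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal face-detection sketch with the google-cloud-vision client (>= 2.0 assumed), using a placeholder file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
# matching the buckets reported above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```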

Feature analysis

Amazon

Person 98.2%

Text analysis

Amazon

AMERICA
HALE AMERICA
HALE
5
18132A.
B
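
Strings like these can be produced by Amazon Rekognition's DetectText operation, which returns both whole lines and individual words; a minimal boto3 sketch, again with a placeholder file name.

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Each detection is typed as a LINE or a WORD, with its own confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")
```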

Google

18132 A. MERICO 18132A.
18132
MERICO
A.
18132A.
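
For the Google results, a minimal text-detection sketch with the google-cloud-vision client: the first annotation is the full detected block and the remaining annotations are individual tokens, which is why both whole phrases and fragments appear above. The file name is a placeholder and google-cloud-vision >= 2.0 is assumed.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full text block; later ones are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```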