Human Generated Data

Title

Untitled (people dressed in circus animal costumes)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7650

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.5
Human 98.5
Person 98.2
Person 96.9
Person 96.7
Person 95.6
Person 95.6
Person 94.6
Person 89
Person 82.8
Leisure Activities 78
Person 77.9
People 73.8
Musician 72.8
Musical Instrument 72.8
Crowd 71.8
Person 70.6
Clothing 67
Apparel 67
Person 66.1
Music Band 62.6
Text 61.9
Stage 61.7
Art 58.2
Drawing 58.2
Shorts 57.2
Performer 55.6
Person 49
Person 48.3
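Each machine-generated tag above pairs a label with a confidence score on a 0-100 scale. A minimal sketch (assuming the raw listing is available as plain-text lines, as shown) of parsing such lines into structured (label, confidence) pairs:

```python
import re

def parse_tags(lines):
    """Parse 'Label 98.5'-style lines into (label, confidence) pairs.

    Assumes each tag line ends in a numeric confidence score;
    lines that do not match this pattern are skipped.
    """
    pattern = re.compile(r"^(.*\S)\s+(\d+(?:\.\d+)?)$")
    tags = []
    for line in lines:
        m = pattern.match(line.strip())
        if m:
            tags.append((m.group(1), float(m.group(2))))
    return tags

# A few lines taken from the Amazon tag list above
sample = [
    "Person 98.5",
    "Leisure Activities 78",
    "Musical Instrument 72.8",
]
print(parse_tags(sample))
```

Note that repeated labels (e.g. the many "Person" entries) correspond to distinct detections, so duplicates should be kept rather than deduplicated.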

Imagga
created on 2022-01-08

people 24
man 23.5
person 23.2
brass 21.4
wind instrument 18.3
helmet 17.9
sport 17.3
male 16.3
crowd 15.4
adult 14.6
men 14.6
musical instrument 14.5
silhouette 14.1
trombone 14
clothing 13.7
city 13.3
black 12.6
football helmet 12.3
human 12
symbol 11.4
group 10.5
mask 10.4
occupation 10.1
business 9.7
flag 9.4
danger 9.1
portrait 9.1
uniform 9
team 9
world 8.8
urban 8.7
women 8.7
costume 8.6
lights 8.3
protection 8.2
headdress 8.2
athlete 8.1
activity 8.1
dance 8
stadium 7.9
nighttime 7.8
audience 7.8
scene 7.8
motion 7.7
patriotic 7.7
stage 7.6
fashion 7.5
dark 7.5
art 7.4
event 7.4
active 7.3
breastplate 7.3
speed 7.3
competition 7.3
design 7.3
work 7.1
astronaut 7.1
working 7.1
day 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

drawing 97.7
outdoor 96.5
clothing 95.3
person 94.6
text 94.5
sketch 92
cartoon 88.5
standing 79.1
group 77.9
posing 75.7
painting 63.9
people 62.1
black and white 50.8

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 97.3%
Calm 42%
Fear 18.7%
Disgusted 18.2%
Surprised 12.4%
Happy 3.7%
Confused 2.1%
Sad 1.5%
Angry 1.3%

AWS Rekognition

Age 25-35
Gender Male, 99.4%
Confused 68%
Sad 24.4%
Disgusted 2.7%
Calm 2.7%
Angry 0.7%
Happy 0.7%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 25-35
Gender Male, 62.1%
Calm 56%
Sad 36.7%
Confused 3.2%
Happy 1.7%
Disgusted 0.8%
Angry 0.7%
Fear 0.5%
Surprised 0.5%
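Each Rekognition face block above reports a full emotion distribution for one detected face; the headline emotion is simply the highest-scoring label. A small sketch of extracting it, using the second face's scores from the listing:

```python
def dominant_emotion(scores):
    """Return the emotion label with the highest confidence.

    scores: mapping of emotion name -> confidence percentage,
    as reported per face in the Rekognition output above.
    """
    return max(scores, key=scores.get)

# Scores for the second detected face, copied from the listing
face = {
    "Confused": 68.0, "Sad": 24.4, "Disgusted": 2.7, "Calm": 2.7,
    "Angry": 0.7, "Happy": 0.7, "Surprised": 0.4, "Fear": 0.3,
}
print(dominant_emotion(face))  # → Confused
```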

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%

Captions

Microsoft

a group of people posing for a photo 95.5%
a group of people posing for the camera 95.4%
a group of people posing for a picture 95.3%

Text analysis

Amazon

28452.
SP
YT37A2-MAGOX

Google

28452. YT37A2-YAGO
28452.
YT37A2-YAGO