Human Generated Data

Title

Untitled (circus performers standing on four moving horses/circus performers in ring)

Date

1966

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11855

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label, confidence score in %)

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 98.8
Clothing 97.3
Apparel 97.3
Person 96.4
Person 96.3
Person 89.8
Person 87.4
Advertisement 87.2
Person 80.5
Horse 80.1
Animal 80.1
Mammal 80.1
Collage 79.2
Text 78.1
Person 78
Person 76.4
People 75.2
Poster 74.5
Horse 70.4
Person 69.7
Leisure Activities 69.6
Person 69.3
Accessories 68
Accessory 68
Tie 68
Face 62.6
Overcoat 58.4
Coat 58.4
Person 58.1
Female 58.1
Suit 57.5
Paper 57.3
Horse 56.8
Flyer 56.5
Brochure 56.5
Shorts 55.4
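
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such a list could be reproduced, assuming boto3 is configured with AWS credentials and that "steinmetz_circus.jpg" is a hypothetical local copy of the photograph:

```python
# Minimal sketch: label/confidence pairs like the Amazon list above, using
# Amazon Rekognition DetectLabels. Assumes configured boto3 credentials;
# "steinmetz_circus.jpg" is a hypothetical local copy of the photograph.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_circus.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,  # the list above bottoms out around 55%
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```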

Clarifai
created on 2023-10-25

people 99.9
group together 96.7
monochrome 96.7
adult 96.6
wear 95.1
man 93.8
many 92.7
group 92.2
recreation 88.2
child 84.8
woman 84.8
print 84.1
street 83.8
canine 82.3
dog 79.7
illustration 79.4
athlete 77.8
sports equipment 77.1
competition 76.9
crowd 76.4
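
The Clarifai concepts above could be reproduced with Clarifai's v2 predict REST endpoint. A minimal sketch, assuming a valid API key; the "general-image-recognition" model ID and the exact payload shape are assumptions to verify against current Clarifai documentation:

```python
# Minimal sketch: concept/confidence pairs like the Clarifai list above via the
# Clarifai v2 predict REST endpoint. The API key is a placeholder and the model
# ID is an assumption; check current Clarifai docs for the exact request shape.
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder
MODEL_ID = "general-image-recognition"   # assumed general-model ID

with open("steinmetz_circus.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
    timeout=30,
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports values in 0-1; scale to match the percentages above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```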

Imagga
created on 2022-01-15

daily 37.2
city 19.9
people 16.7
horse 16.5
man 14.8
building 13.5
sport 12.8
urban 12.2
travel 12
newspaper 11.8
outdoors 11.4
speed 11
architecture 10.9
shop 10.4
walking 10.4
street 10.1
person 9.9
horses 9.7
animal 9.7
men 9.4
beach 9.3
male 9.2
product 9.1
vacation 9
shoe shop 8.8
sand 8.7
run 8.7
day 8.6
adult 8.6
outdoor 8.4
graffito 8.3
transportation 8.1
farm 8
portrait 7.8
decoration 7.7
crowd 7.7
old 7.7
sky 7.6
two 7.6
house 7.5
color 7.2
life 7.2
art 7.2
work 7.1
creation 7.1
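
The Imagga tags above correspond to Imagga's /v2/tags endpoint, which authenticates with HTTP Basic auth (API key and secret). A minimal sketch, assuming the photograph is reachable at a hypothetical public URL:

```python
# Minimal sketch: tag/confidence pairs like the Imagga list above via the
# Imagga /v2/tags endpoint. Key, secret, and image URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/steinmetz_circus.jpg"  # hypothetical copy

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```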

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 95.9
person 88.7
court 88.2
clothing 82.8
black and white 79.1
player 60.8
net 30.5
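
The Microsoft tags above resemble the output of Azure Computer Vision's Analyze Image REST API. A minimal sketch, assuming a v3.2 endpoint and key on a hypothetical Azure resource; the API version available on a given resource may differ:

```python
# Minimal sketch: tag/confidence pairs like the Microsoft list above via the
# Azure Computer Vision Analyze Image REST API (v3.2 assumed). Endpoint and
# key are placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_CV_KEY"                                       # placeholder

with open("steinmetz_circus.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Azure reports confidence in 0-1; scale to match the percentages above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```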

Color Analysis

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 97.8%
Calm 90.9%
Sad 3.5%
Surprised 2.2%
Happy 1.3%
Confused 0.8%
Disgusted 0.7%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 22-30
Gender Male, 82.8%
Sad 32.6%
Calm 27.5%
Happy 15.8%
Angry 11.9%
Disgusted 6.7%
Fear 2.4%
Confused 2.3%
Surprised 0.9%

AWS Rekognition

Age 14-22
Gender Female, 70.4%
Happy 50.3%
Fear 39.8%
Calm 5.1%
Sad 1.8%
Confused 1.1%
Angry 0.7%
Disgusted 0.7%
Surprised 0.6%

AWS Rekognition

Age 23-31
Gender Male, 78.1%
Calm 39.1%
Happy 33.4%
Sad 15.4%
Angry 5.8%
Confused 2.4%
Surprised 2%
Fear 0.9%
Disgusted 0.9%
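
The four AWS Rekognition blocks above (age range, gender, and ranked emotion scores per detected face) match the shape of Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch, assuming configured boto3 credentials and a hypothetical local copy of the photograph:

```python
# Minimal sketch: per-face age range, gender, and emotion scores like the
# AWS Rekognition blocks above, using DetectFaces with Attributes=["ALL"].
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_circus.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; rank them as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    print()
```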

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
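
The Google Vision blocks above are the likelihood ratings that the Cloud Vision API attaches to each detected face. A minimal sketch using the google-cloud-vision Python client, assuming configured Google Cloud credentials and a hypothetical local file:

```python
# Minimal sketch: per-face likelihood ratings like the Google Vision blocks
# above, using google-cloud-vision face detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_circus.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enums print as e.g. VERY_UNLIKELY / UNLIKELY / LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
    print()
```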

Feature analysis

Amazon

Person 99.7%
Horse 80.1%
Poster 74.5%
Tie 68%

Categories

Text analysis

Amazon

10
92165
SLISS
G
9
T3302
VIII
VIII YY37A2
YY37A2
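
The Amazon strings above are raw OCR detections of the kind returned by Rekognition's DetectText operation; fragments like "SLISS" and "YY37A2" are simply what the model read off the print. A minimal sketch, assuming configured boto3 credentials and a hypothetical local copy of the photograph:

```python
# Minimal sketch: raw OCR strings like the Amazon text-analysis list above,
# using Amazon Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_circus.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # LINE entries give whole lines; WORD entries repeat the same text per token.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```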

Google

420T 2.VEEIAEiA 59176. 55175
420T
2.VEEIAEiA
59176.
55175
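
The Google strings above are consistent with Cloud Vision text detection, where the first annotation holds the full detected block and the later entries are the individual tokens. A minimal sketch, assuming configured Google Cloud credentials and a hypothetical local file:

```python
# Minimal sketch: OCR output like the Google text-analysis list above, using
# google-cloud-vision text detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_circus.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the whole detected block; the rest are single tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```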