Human Generated Data

Title

Untitled (multiple circus performers riding on the same horse)

Date

1966

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11853

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 98.7
Human 98.7
Person 98.5
Horse 98.3
Mammal 98.3
Animal 98.3
Clothing 96.1
Apparel 96.1
Horse 90.9
Person 88.8
Person 87
Person 86.2
Person 75.7
Female 71.2
Dog 69.3
Canine 69.3
Pet 69.3
Person 68.6
Person 64.9
Cow 64
Cattle 64
Gown 59.7
Fashion 59.7
Collage 59.4
Advertisement 59.4
Poster 59.4
Road 58.1
Text 57.9
Robe 57.6
Bridegroom 56.6
Wedding 56.6
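
Labels like the Amazon set above are the kind of output returned by AWS Rekognition's label detection. A minimal sketch, assuming boto3 is installed, AWS credentials are configured, and the photograph lives at a hypothetical local path:

```python
# Minimal sketch: generating object/scene labels with AWS Rekognition.
# The file path "steinmetz_circus.jpg" is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_circus.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly the lowest confidence shown in the listing above
)

# Print each label with its confidence, e.g. "Person 98.7"
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```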

Clarifai
created on 2023-10-25

people 99.8
cavalry 99.5
monochrome 98.7
group together 97.5
transportation system 96.4
man 95.4
mammal 95.1
recreation 94.7
vehicle 93.9
many 93.4
seated 93.4
competition 92.8
group 92.7
adult 91.6
street 90.3
child 88.7
circus 88.5
spectator 86.6
three 84.9
fun 84.4

Imagga
created on 2022-01-15

graffito 30.8
decoration 24.5
building 21.1
architecture 19.7
window 18
structure 17.7
old 17.4
city 16.6
pattern 13.7
wall 13.5
stone 12.3
detail 12.1
travel 12
modern 11.9
texture 11.8
house 11.7
ancient 11.2
device 11.1
business 10.9
design 10.7
exterior 10.1
tourism 9.9
urban 9.6
art 9.2
vintage 9.1
technology 8.9
light 8.7
grunge 8.5
wallpaper 8.4
sky 8.3
aged 8.1
landmark 8.1
sculpture 8
web 7.8
black 7.8
door 7.8
glass 7.8
buildings 7.6
town 7.4
fence 7.4
street 7.4
historic 7.3
backgrounds 7.3
night 7.1
summer 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

black and white 94.5
text 85.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-31
Gender Male, 92.8%
Calm 79.1%
Sad 10.8%
Happy 4.3%
Angry 1.7%
Fear 1.1%
Confused 1.1%
Surprised 1%
Disgusted 0.9%

AWS Rekognition

Age 35-43
Gender Female, 98.3%
Sad 42%
Calm 30.6%
Disgusted 8.5%
Happy 8.4%
Confused 3.5%
Angry 2.8%
Surprised 2.2%
Fear 2%

AWS Rekognition

Age 21-29
Gender Male, 99.9%
Happy 41.4%
Calm 37%
Fear 15.9%
Sad 2.7%
Angry 1.2%
Surprised 0.9%
Disgusted 0.5%
Confused 0.5%

AWS Rekognition

Age 45-51
Gender Male, 77.4%
Calm 95.1%
Sad 2.1%
Confused 1.1%
Fear 0.7%
Disgusted 0.3%
Angry 0.3%
Happy 0.2%
Surprised 0.2%
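
The age-range, gender, and emotion scores in the AWS Rekognition blocks above are per-face attributes from Rekognition face detection. A minimal sketch, reusing the same hypothetical image path and boto3 setup as the label example:

```python
# Minimal sketch: per-face age, gender, and emotion estimates with
# AWS Rekognition face detection. The file path is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_circus.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort to match the listing order above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```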

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
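
The likelihood ratings above correspond to Google Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision package is installed, application credentials are configured, and the same hypothetical image path:

```python
# Minimal sketch: face-attribute likelihoods with Google Cloud Vision.
# The file path is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_circus.jpg", "rb") as f:
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Likelihood enum values map onto the wording used in the listing above
likelihood_names = (
    "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"
)

for face in response.face_annotations:
    print("Surprise", likelihood_names[face.surprise_likelihood])
    print("Anger", likelihood_names[face.anger_likelihood])
    print("Sorrow", likelihood_names[face.sorrow_likelihood])
    print("Joy", likelihood_names[face.joy_likelihood])
    print("Headwear", likelihood_names[face.headwear_likelihood])
    print("Blurred", likelihood_names[face.blurred_likelihood])
```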

Feature analysis

Amazon

Person 98.7%
Horse 98.3%
Dog 69.3%
Cow 64%

Categories

Text analysis

Amazon

555
555 86.
86.
:
KODYK
: , KODYK
NAGOA
,
68855
1 ١٢ : :
١٢
1
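
Strings like those above (film-edge markings such as "KODYK" and the frame numbers) are the kind of output returned by AWS Rekognition's text detection. A minimal sketch, using the same hypothetical image path and boto3 setup as the earlier Rekognition examples:

```python
# Minimal sketch: reading text out of the photograph with AWS Rekognition.
# The file path is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_circus.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections are grouped strings; WORD detections are the
# individual tokens like those in the listing above.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```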

Google

55587. 555 86.
55587.
555
86.