Human Generated Data

Title

Untitled (circus performers: three acrobats riding on one unicycle)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5447

Copyright

© Estate of Joseph Janney Steinmetz


Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.3
Person 99.3
Person 98
Machine 94
Wheel 94
Acrobatic 91.4
Vehicle 88.6
Bike 88.6
Bicycle 88.6
Transportation 88.6
Person 88.6
Person 82.5
Town 69.3
Urban 69.3
Building 69.3
City 69.3
Sculpture 62.5
Art 62.5
Clothing 62.4
Apparel 62.4
Leisure Activities 60.1

Imagga
created on 2022-01-23

sports equipment 100
cricket equipment 100
cricket bat 98.5
wicket 79.9
equipment 59.5
sky 30
freedom 23.8
sport 21.4
statue 21.4
monument 20.6
fun 20.2
outdoor 19.1
lifestyle 18.1
summer 18
outdoors 17.9
jump 17.3
architecture 17.2
people 16.7
joy 16.7
city 16.6
grass 16.6
travel 16.2
happiness 15.7
active 15.3
building 15.1
man 14.8
person 14.4
adult 14.2
action 13.9
landmark 13.5
jumping 13.5
happy 13.2
spring 12.6
leisure 12.5
sculpture 12.4
park 12.4
clouds 11.8
torch 11.8
stone 11.8
memorial 11.8
body 11.2
fly 11.2
health 11.1
air 11
teenager 10.9
activity 10.7
tourism 10.7
fountain 10.2
structure 10.2
light 10.2
symbol 10.1
energy 10.1
healthy 10.1
exercise 10
fitness 9.9
human 9.7
obelisk 9.5
vacation 9
color 8.9
urban 8.7
marble 8.7
motion 8.6
male 8.5
field 8.4
sun 8.1
athlete 8
boy 7.8
art 7.8
cloud 7.8
attractive 7.7
vitality 7.5
relaxation 7.5
enjoyment 7.5
water 7.3
playing 7.3
history 7.2
to 7.1
ball 7.1
life 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.9
statue 97.6
outdoor 86.4
black and white 82
sky 74.4
sculpture 72.9
cloud 72.1

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Male, 74.7%
Calm 61.9%
Sad 27.2%
Angry 5.2%
Confused 2.8%
Surprised 1.4%
Happy 0.6%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 28-38
Gender Female, 58.4%
Calm 97.5%
Surprised 1.3%
Sad 0.5%
Angry 0.3%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Calm 95.3%
Sad 2.8%
Happy 0.9%
Surprised 0.4%
Angry 0.2%
Confused 0.2%
Fear 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Wheel 94%
Bicycle 88.6%

Captions

Microsoft

a vintage photo of a person 87.9%
a vintage photo of a person with a racket 73.8%
a vintage photo of a person standing in front of a crowd 73.7%

Text analysis

Amazon

22859.
18
KODAK-SVLEIA

Google

KODVK-2YLE 22859.
22859.
KODVK-2YLE