Human Generated Data

Title

Untitled (elephants performing in outdoor circus)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8859

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 97.1
Human 97.1
Shelter 96.2
Countryside 96.2
Rural 96.2
Nature 96.2
Outdoors 96.2
Building 96.2
Person 95.8
Person 95.8
Person 95.5
Person 95.2
Person 82.4
Person 81.8
Crowd 78.3
Person 77.3
Airplane 73.1
Transportation 73.1
Vehicle 73.1
Aircraft 73.1
Person 70.8
Poster 69.7
Advertisement 69.7
People 68.1
Meal 65.5
Food 65.5
Urban 63.8
Text 61.1
Person 60.3
Person 60.1
Word 57.1
Architecture 56.3
Audience 55.6
Theme Park 55.6
Amusement Park 55.6
Person 55.1
Person 44.5

Imagga
created on 2022-01-15

stage 64.2
platform 50.4
city 31.6
architecture 29.7
night 21.3
sky 21
travel 20.4
building 18.9
urban 16.6
history 15.2
cockpit 14.9
water 14.7
landscape 14.1
light 14
landmark 13.5
old 13.2
tourism 13.2
center 12.3
famous 12.1
house 11.7
cityscape 11.4
town 11.1
construction 11.1
tower 10.7
panorama 10.5
freight car 10.5
buildings 10.4
structure 10.2
clouds 10.1
street 10.1
houses 9.7
downtown 9.6
wheeled vehicle 9.5
skyline 9.5
bridge 9.5
sea 9.4
church 9.2
tree 9.2
vacation 9
car 9
sunset 9
coast 9
river 8.9
district 8.7
port 8.7
ancient 8.6
park 8.6
capital 8.5
historical 8.5
vehicle 8.4
black 8.4
summer 8.4
lights 8.3
ocean 8.3
tourist 8.2
scene 7.8
grunge 7.7
sculpture 7.6
stone 7.6
monument 7.5
historic 7.3

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.7
black and white 90
sky 74.9
house 72.1
people 67.8
display 55.8
crowd 46.6
watching 43
picture frame 18.5
screenshot 17.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-36
Gender Female, 59.4%
Calm 47.8%
Sad 34.6%
Confused 5.9%
Happy 4.8%
Disgusted 3.1%
Fear 1.9%
Angry 1.2%
Surprised 0.7%

Feature analysis

Amazon

Person 97.1%
Airplane 73.1%

Captions

Microsoft

a group of people standing in front of a crowd 85.8%
a group of people in front of a crowd 85.7%
a group of people watching a band on stage in front of a crowd 78.2%

Text analysis

Amazon

39567.
KODAK
EIFW

Google

KODVK 2.v LEI A LIrW KODVK 2.v LEIA LITW
KODVK
2.v
LEI
A
LIrW
LEIA
LITW