Human Generated Data

Title

Untitled (three-ring circus practicing the show)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8573

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 96.1
Human 96.1
Person 96
Person 90.2
Person 89.4
Person 88.6
Person 82.6
Horse 81.3
Mammal 81.3
Animal 81.3
Person 78.2
Person 75.5
Person 74
Horse 73.3
Elephant 72
Wildlife 72
Person 63.5
Person 61.5
Leisure Activities 60.5
Acrobatic 58.8
Person 54.3
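
The Amazon tags above match the output format of AWS Rekognition's label detection. A minimal sketch of how comparable labels could be retrieved, assuming boto3 credentials are already configured and using a hypothetical local copy of the photograph (this is illustrative, not the museum's actual pipeline):

    import boto3

    # Region and credentials are assumed to come from the standard AWS config.
    client = boto3.client("rekognition")

    # Hypothetical local filename for the Steinmetz photograph.
    with open("steinmetz_circus.jpg", "rb") as f:
        image_bytes = f.read()

    # Request labels above a 50% confidence floor, mirroring the tag list above.
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")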

Clarifai
created on 2023-10-26

people 99.2
cow 99.1
mammal 98.6
livestock 98.6
milk 98.4
cattle 98.3
cavalry 97.9
farm 97.2
group 96.4
animal 96.1
shed 95.3
monochrome 94.9
dairy 93.8
indoors 93.1
many 91.4
building 91
bull 90.7
two 90.6
mall 89.9
man 89.9

Imagga
created on 2022-01-09

architecture 23.4
supermarket 22.7
station 21
structure 20.7
building 20.4
urban 19.2
steel 19
construction 18.8
grocery store 18.4
travel 18.3
industry 17.9
city 17.5
sky 17.2
transportation 17
metal 16.1
modern 15.4
snow 15.2
road 14.4
interior 14.1
carousel 14.1
marketplace 13.8
river 13.3
street 12.9
transport 12.8
winter 12.8
business 12.8
industrial 12.7
window 12.3
dairy 12.1
ride 12.1
train 11.8
house 11.7
landscape 11.2
inside 11
power 10.9
glass 10.4
cold 10.3
mercantile establishment 10.3
rail 9.8
container 9.6
line 9.6
mechanical device 9.4
perspective 9.4
plant 9
tower 8.9
weather 8.9
light 8.7
roof 8.7
lamp 8.6
buildings 8.5
electric 8.4
device 8.4
outdoor 8.4
greenhouse 8.4
speed 8.2
indoor 8.2
technology 8.2
trees 8
mechanism 7.9
equipment 7.7
wheeled vehicle 7.7
iron 7.6
traffic 7.6
bridge 7.6
tourism 7.4
hall 7.4
water 7.3
tourist 7.2
airport 7.2
home 7.2
day 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

black and white 87.1
text 82.8

Color Analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 97.7%
Calm 60.9%
Surprised 11.9%
Disgusted 11.3%
Fear 4.9%
Confused 4%
Sad 3.8%
Happy 1.7%
Angry 1.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
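
The age, gender, and emotion estimates above correspond to AWS Rekognition's face detection output, and the likelihood ratings to Google Cloud Vision's face annotations. A minimal sketch of the Rekognition call, again assuming boto3 credentials and a hypothetical local image file (not the museum's actual pipeline):

    import boto3

    client = boto3.client("rekognition")
    with open("steinmetz_circus.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # Attributes=["ALL"] returns age range, gender, and per-face emotion scores.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}, Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")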

Feature analysis

Amazon

Person 96.1%
Horse 81.3%
Elephant 72%

Categories

Text analysis

Amazon

42253
OS-
CHEC
MILZ

Google

422S
3
422S 3
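
The strings above are OCR readings of lettering detected in the photograph. Comparable output could be produced with AWS Rekognition's text detection; a minimal sketch under the same assumptions as the examples above:

    import boto3

    client = boto3.client("rekognition")
    with open("steinmetz_circus.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})
    # Keep only line-level detections, which match the short strings listed above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")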