Human Generated Data

Title

Untitled (two young children in playpen)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8342

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Furniture 99.9
Railing 98.4
Person 93.1
Human 93.1
Person 90.8
Chair 87.6
Crib 87.2
Face 84.1
Bed 76.7
Apparel 75.1
Clothing 75.1
Fence 62.7
Kid 59.6
Child 59.6
Water 59.4
Waterfront 59.4
Suit 59.1
Coat 59.1
Overcoat 59.1
Outdoors 56.3
Banister 55.1
Handrail 55.1
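
The Amazon tags above are label-and-confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. The sketch below shows how comparable tags could be generated with boto3; the image file name, label limit, and confidence threshold are assumptions rather than details taken from this record.

    # Sketch: produce label/confidence pairs like the Amazon tags above.
    # Assumes boto3 is installed and AWS credentials are configured; the
    # file name "steinmetz_playpen.jpg" is a placeholder, not the museum's
    # actual asset name.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_playpen.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,        # roughly the number of tags listed above
            MinConfidence=55.0,  # the lowest confidence shown above is about 55
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')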

Clarifai
created on 2023-10-25

child 99.8
people 99.6
boy 96.8
monochrome 95.9
one 94
two 92.1
man 91.2
adult 89.7
three 88.6
recreation 88.3
group 87.6
indoors 87.1
woman 85.8
wear 85.2
baby 84.7
portrait 84.2
step 84.1
handrail 83.2
fence 82.7
administration 82.6

Imagga
created on 2022-01-09

bobsled 32.7
sled 26.1
vehicle 26
cockpit 25.4
structure 24.1
industry 20.5
steel 20.3
urban 19.2
industrial 19
transportation 18.8
building 17
metal 16.9
interior 16.8
window 16.5
construction 16.2
conveyance 16.1
architecture 15.7
modern 15.4
power 15.1
city 14.9
man 14.9
inside 14.7
balcony 14.3
device 13.5
silhouette 13.2
people 12.8
business 12.7
travel 12.7
light 12
seat 12
equipment 12
airport 11.9
factory 11.7
helmet 11.2
glass 10.9
station 10.6
reflection 10.5
sky 10.2
passenger 10.1
gate 9.9
vacation 9.8
technology 9.6
engineering 9.5
work 9.4
iron 9.3
energy 9.2
air 9.2
protection 9.1
departure 8.8
pipe 8.7
pollution 8.6
journey 8.5
fly 8.4
security 8.3
environment 8.2
transport 8.2
danger 8.2
office 8
deck 7.9
high 7.8
production 7.8
check 7.7
fuel 7.7
roof 7.7
flight 7.7
old 7.6
outdoor 7.6
plant 7.6
perspective 7.5
fun 7.5
car 7.5
safety 7.4
water 7.3
worker 7.1
night 7.1
support 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 96.3
black and white 95.3
toddler 89.9
person 88.3
human face 88
child 83.5
boy 83.3
baby 80
monochrome 68.4
playground 65.9
clothing 63.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 6-14
Gender Female, 97.2%
Fear 83.7%
Calm 11%
Surprised 2.8%
Angry 1.2%
Disgusted 0.6%
Confused 0.3%
Sad 0.2%
Happy 0.2%
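
The age, gender, and emotion estimates above match the shape of the AWS Rekognition DetectFaces response when all attributes are requested. The sketch below is one way to reproduce such a breakdown; the file name is a placeholder and is not taken from this record.

    # Sketch: age range, gender, and per-emotion confidences as listed above.
    # Attributes=["ALL"] asks Rekognition for AgeRange, Gender, and Emotions
    # in addition to the default face attributes.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_playpen.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')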

Feature analysis

Amazon

Person 93.1%
Crib 87.2%

Captions

Microsoft
created on 2022-01-09

a boy in a cage 84.7%
a group of people in a cage 84.6%
a boy standing in front of a fence 68.9%
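
The captions above resemble the output of Microsoft's image description service. A minimal sketch using the Azure Computer Vision Python SDK is shown below; the endpoint, subscription key, and file name are placeholders, and this record does not state which Microsoft service version produced the captions.

    # Sketch: generate ranked captions with confidences like those above.
    # The endpoint and key are placeholders for an Azure Computer Vision
    # resource; requires the azure-cognitiveservices-vision-computervision
    # package.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<resource-name>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<subscription-key>"),
    )

    with open("steinmetz_playpen.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")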

Text analysis

Amazon

12061
1206
12061.
a
1206 a KODAN 12061.
KODAN
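
The strings above are the sort of result returned by the AWS Rekognition DetectText API, which reports both whole lines and individual words; that is why the same digits appear more than once. A sketch follows; the file name is a placeholder.

    # Sketch: list detected text such as "12061" and "KODAN" above.
    # Each detection is either a LINE or a WORD, so overlapping strings
    # are expected in the output.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_playpen.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])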

Google

12061 12061 12061 YAGOX-YT37A2-AMT 12 06
12061
YAGOX-YT37A2-AMT
12
06