Human Generated Data

Title

Untitled (man in chicken coop feeding chickens, Jacksonville, FL)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5497

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23 (label, confidence %)

Building 99
Factory 95.7
Person 92
Human 92
Corridor 90.5
Outdoors 90.1
Nature 89.7
Person 74
Person 72.8
Person 69.5
Car 68.7
Transportation 68.7
Vehicle 68.7
Automobile 68.7
Train 68.1
Person 67.7
Countryside 66.5
Assembly Line 59.9
Shack 59
Rural 59
Hut 59
Prison 58.5
Person 56.9
Urban 56.2
Handrail 55.9
Banister 55.9
Person 52.2
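
The labels above are the kind of output AWS Rekognition's label detection returns. A minimal sketch of reproducing such a list with boto3 follows; the filename and region are placeholders rather than values from this record, and AWS credentials are assumed to be configured locally.

    import boto3

    # Placeholder region; any region offering Rekognition works.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("steinmetz_chicken_coop.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores (0-100);
    # the list above goes down to roughly 52%, hence MinConfidence=50.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")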

Imagga
created on 2022-01-23 (tag, confidence %)

railing 41.9
barrier 41
structure 38.3
architecture 34.5
building 34
city 29.1
obstruction 28.1
bridge 25
gate 23.7
urban 23.6
steel 22.1
turnstile 20.2
travel 18.3
construction 18
modern 16.8
greenhouse 16.3
transportation 16.1
metal 16.1
water 15.3
river 15.1
interior 15
station 14.8
glass 14
business 13.4
industrial 12.7
light 12.7
movable barrier 12.3
window 12.1
old 11.8
support 11.6
sky 11.5
center 11.3
industry 11.1
inside 11
train 10.8
road 10.8
tourism 10.7
transport 10
futuristic 9.9
escalator 9.9
staircase 9.8
walkway 9.8
office 9.6
walk 9.5
motion 9.4
way 9.3
landmark 9
plant 8.9
landscape 8.9
subway 8.9
stair 8.8
indoors 8.8
steps 8.8
device 8.8
move 8.6
store 8.5
perspective 8.5
park 8.4
historic 8.2
vacation 8.2
technology 8.2
reflection 8.1
new 8.1
stairs 8
stairway 7.9
step 7.8
airport 7.8
factory 7.7
concrete 7.6
track 7.4
design 7.3
pipe 7.1
wall 7.1
pier 7
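
Tags like those above can be generated through Imagga's REST tagging endpoint. A minimal sketch using the requests library follows; the API key, secret, and image URL are placeholders and are not part of this record.

    import requests

    API_KEY = "your_api_key"        # placeholder
    API_SECRET = "your_api_secret"  # placeholder
    IMAGE_URL = "https://example.org/steinmetz_chicken_coop.jpg"  # placeholder

    # The v2/tags endpoint takes an image URL and returns English tags
    # with confidence scores on a 0-100 scale.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")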

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 77.2%
Sad 70.3%
Happy 8.6%
Calm 7.9%
Confused 7.7%
Angry 1.6%
Surprised 1.3%
Fear 1.3%
Disgusted 1.2%
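
A face record like the one above (age range, gender, and ranked emotions) is what Rekognition's face detection returns when all attributes are requested. A minimal, self-contained boto3 sketch; the filename and region are placeholders.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

    with open("steinmetz_chicken_coop.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    # Attributes=["ALL"] is required to get AgeRange, Gender, and Emotions.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; sort by confidence to match the list above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")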

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
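
The Google Vision values are likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch with the google-cloud-vision client (2.x assumed); credentials are assumed to be configured and the filename is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_chicken_coop.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each likelihood field is an enum; .name yields strings like POSSIBLE.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)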

Feature analysis

Amazon

Person 92%
Car 68.7%
Train 68.1%

Captions

Microsoft

a group of people in a cage 70%
a close up of a cage 69.9%
a group of people standing in a cage 58.7%
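
Candidate captions with confidences like the three above are what Azure's Computer Vision "describe image" operation produces. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("your_subscription_key"),   # placeholder key
    )

    # describe_image returns up to max_candidates captions, each with a
    # confidence in the 0-1 range (shown as percentages in the list above).
    result = client.describe_image(
        "https://example.org/steinmetz_chicken_coop.jpg",  # placeholder URL
        max_candidates=3,
    )

    for caption in result.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")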

Text analysis

Amazon

23102
BROILER
tested
BROILER FEB
Farin tested
Farin
FEB
Parr
Designst
20182
Designst NOTE
88
NOTE
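
The detected strings above mix whole lines and individual words, which is how Rekognition's text detection reports results. A minimal, self-contained boto3 sketch; the filename and region are placeholders.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

    with open("steinmetz_chicken_coop.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Each detection is typed LINE or WORD; confidence is on a 0-100 scale.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              f"{detection['Confidence']:.1f}")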

Google

farro SRDILER FE 23102
farro
SRDILER
FE
23102
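
The Google results follow the same pattern via the Vision API's text detection: the first annotation is the full recovered block ("farro SRDILER FE 23102"), followed by the individual words. A minimal sketch; credentials are assumed to be configured and the filename is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_chicken_coop.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # text_annotations[0] holds the full detected text; the rest are word-level hits.
    for annotation in response.text_annotations:
        print(annotation.description)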