Human Generated Data

Title

Untitled (acrobats in yard)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4538

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.6
Person 99.6
Nature 99.2
Outdoors 98.9
Car 98.8
Transportation 98.8
Vehicle 98.8
Automobile 98.8
Building 98.8
Housing 98.5
Shelter 97.2
Countryside 97.2
Rural 97.2
House 93.3
Person 90.8
Hut 88.7
Person 86.4
Shack 86.2
Wheel 85.4
Machine 85.4
Wheel 80.8
Cabin 78.8
Cottage 75.0
Drawing 55.8
Art 55.8
Yard 55.2
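
These Amazon tags match the label/confidence output of AWS Rekognition's DetectLabels API. A minimal sketch of how such a list can be produced with boto3; the region and file name are assumptions, since this record does not describe how the museum's pipeline actually supplied the image:

    import boto3

    # Assumed region and local file; this record does not say how the
    # image was passed to Rekognition.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_acrobats.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,        # enough to cover a tag list of this length
        MinConfidence=50.0,  # the lowest scores above are around 55
    )

    # Each label carries a name and a 0-100 confidence score, the same
    # "Human 99.6" format used in this record.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")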

Clarifai
created on 2023-10-15

monochrome 97.5
house 96.4
home 94.4
people 93.1
shed 92.9
calamity 92.5
black and white 92.2
building 92.1
wood 91.1
vintage 90.1
nature 89.1
storm 88.8
weather 88.5
street 86.9
tree 86.3
group 85.9
vehicle 84.9
construction 84.6
winter 83.6
no person 83.2
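
The Clarifai tags look like concept predictions from Clarifai's general image-recognition model. A hedged sketch against the public REST endpoint; the API key placeholder, model ID, and image URL are all assumptions, not details taken from this record:

    import requests

    # Hypothetical key and image URL; "general-image-recognition" is the
    # ID of Clarifai's public general model.
    headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}
    body = {"inputs": [{"data": {"image": {"url": "https://example.org/steinmetz.jpg"}}}]}

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers=headers,
        json=body,
    )
    resp.raise_for_status()

    # Concepts carry a 0-1 value; scale by 100 to match the
    # "monochrome 97.5" format above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")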

Imagga
created on 2021-12-14

snow 60.8
structure 44.7
building 42.4
weather 35.9
greenhouse 30.6
fence 27.5
picket fence 25.6
sky 25.5
architecture 20.3
old 20.2
house 19.2
winter 18.7
grunge 17.9
barrier 16.6
vintage 16.5
landscape 16.4
trees 16
cold 15.5
city 15
chairlift 14.1
wheeled vehicle 13.8
conveyance 13
shopping cart 12.6
tree 12.5
season 11.7
urban 11.4
travel 11.3
design 11.3
ski tow 11.2
obstruction 11.1
construction 11.1
texture 11.1
retro 10.7
rural 10.6
scene 10.4
town 10.2
handcart 10.1
mobile home 9.7
housing 9.7
forest 9.6
home 9.6
antique 9.5
light 9.4
power 9.2
field 9.2
dirty 9
pattern 8.9
grass 8.7
skyline 8.6
clouds 8.5
summer 8.4
ice 8.3
style 8.2
tower 8.1
river 8
holiday 7.9
trailer 7.7
vehicle 7.7
dark 7.5
wind 7.5
silhouette 7.5
mountains 7.4
border 7.2
road 7.2
scenery 7.2
black 7.2
hut 7.2
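
The Imagga tags follow the shape of Imagga's /v2/tags endpoint, whose WordNet-derived vocabulary explains entries like "picket fence" and "wheeled vehicle". A sketch using HTTP basic auth; the key, secret, and image URL are assumptions:

    import requests

    # Hypothetical credentials and image URL.
    auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/steinmetz.jpg"},
        auth=auth,
    )
    resp.raise_for_status()

    # Imagga reports confidence on a 0-100 scale already, matching the
    # "snow 60.8" format above.
    for entry in resp.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")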

Google
created on 2021-12-14

Building 93.1
Black 89.5
Black-and-white 85.1
Plant 84.2
Tree 83.9
Adaptation 79.4
House 78.9
Tints and shades 77.2
Monochrome photography 76.8
Monochrome 76.4
Twig 73.3
Cottage 72.9
Wood 69.6
Shack 65.4
Roof 64.9
Event 64.8
Stock photography 64.6
Town 64.5
Motor vehicle 64.4
Landscape 63.5
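
The Google tags have the shape of Cloud Vision label annotations. A minimal sketch with the google-cloud-vision client; the file name is an assumption, and since Vision returns scores as 0-1 floats, the 0-100 values above imply a scaling step in whatever pipeline produced this record:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # uses application default credentials

    with open("steinmetz_acrobats.jpg", "rb") as f:  # hypothetical filename
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image, max_results=20)

    # Scale the 0-1 score by 100 to match the "Building 93.1" format above.
    for annotation in response.label_annotations:
        print(f"{annotation.description} {annotation.score * 100:.1f}")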

Microsoft
created on 2021-12-14

tree 99.8
outdoor 99.4
black and white 91.6
text 90.8
house 88.5
drawing 74.0
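
The Microsoft tags are consistent with Azure Computer Vision's image-tagging operation. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are assumptions:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Hypothetical endpoint and key.
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),
    )

    with open("steinmetz_acrobats.jpg", "rb") as f:  # hypothetical filename
        result = client.tag_image_in_stream(f)

    # Confidence is a 0-1 float; scale by 100 to match "tree 99.8" above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")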

Face analysis

Amazon

AWS Rekognition

Age 35-51
Gender Male, 83.6%
Calm 91.6%
Happy 3.8%
Sad 2.4%
Angry 0.9%
Surprised 0.6%
Disgusted 0.3%
Fear 0.3%
Confused 0.2%

AWS Rekognition

Age 47-65
Gender Male, 78.9%
Calm 63.4%
Surprised 16.4%
Sad 6.9%
Confused 6.0%
Happy 2.7%
Fear 2.5%
Angry 1.7%
Disgusted 0.5%

AWS Rekognition

Age 1-5
Gender Female, 98.8%
Happy 47.2%
Calm 33.7%
Disgusted 9.9%
Sad 3.4%
Confused 1.9%
Angry 1.9%
Fear 1.1%
Surprised 0.8%
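
Each face block above matches the output of Rekognition's DetectFaces call with all attributes requested: an estimated age range, a gender guess with confidence, and eight emotion scores. A self-contained sketch under the same assumed region and file name as the earlier Rekognition example:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("steinmetz_acrobats.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required for age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Sort emotions highest-first to match the ordering in this record.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")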

Feature analysis

Amazon

Person 99.6%
Car 98.8%
Wheel 85.4%

Captions

Microsoft
created on 2021-12-14

an old photo of a person 48.4%
old photo of a person 42.5%
a close up of a sign 42.4%
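
The three candidate captions read like Azure Computer Vision's describe operation with multiple candidates requested. A brief sketch under the same hypothetical endpoint, key, and file name as the Microsoft tags example:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # hypothetical
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),        # hypothetical
    )

    with open("steinmetz_acrobats.jpg", "rb") as f:  # hypothetical filename
        description = client.describe_image_in_stream(f, max_candidates=3)

    # Each candidate has text and a 0-1 confidence; scale by 100 to match
    # the "an old photo of a person 48.4%" format above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")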

Text analysis

Amazon

28
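
The single detected string ("28") is consistent with Rekognition's DetectText operation, which returns both LINE and WORD detections. A closing sketch under the same assumptions as the earlier Rekognition examples:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("steinmetz_acrobats.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE detections are whole lines of text; WORD entries repeat their
    # contents word by word, so print lines only.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])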