Human Generated Data

Title

Untitled (group of men standing near overturned car, Sarasota, FL)

Date

1948

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5417

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Wheel 99
Machine 99
Person 98.7
Human 98.7
Person 97.3
Person 96.9
Person 96.4
Person 95.1
Person 95
Person 93.1
Person 92.9
Person 89
Person 88.8
Person 83.5
Vehicle 81.7
Transportation 81.7
Wheel 81.7
Drawing 80.5
Art 80.5
People 77.9
Car 74.2
Automobile 74.2
Person 70.2
Funeral 63
Spoke 62.3
Face 60.2
Urban 59.6
Sketch 58.7
Plant 58.7
Housing 57.2
Building 57.2
Person 42.8
Person 42.2
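The Amazon list above pairs each detected label with a confidence score on a 0–100 scale. A minimal sketch of parsing such "label confidence" lines and keeping only high-confidence tags — the sample input is transcribed from the list above, and the threshold of 90 is an arbitrary illustrative choice:

```python
# Sample lines copied from the Amazon tag list above.
RAW = """\
Wheel 99
Machine 99
Person 98.7
Car 74.2
Funeral 63
Person 42.8
"""

def parse_tags(text):
    """Split each line into (label, confidence); the label may contain spaces,
    so we split on the last space only."""
    tags = []
    for line in text.splitlines():
        if not line.strip():
            continue
        label, _, conf = line.rpartition(" ")
        tags.append((label, float(conf)))
    return tags

def confident(tags, threshold=90.0):
    """Keep tags at or above the confidence threshold."""
    return [t for t in tags if t[1] >= threshold]

print(confident(parse_tags(RAW)))
# → [('Wheel', 99.0), ('Machine', 99.0), ('Person', 98.7)]
```

Splitting on the last space (`rpartition`) rather than the first lets multi-word labels such as "group together 95" in the Clarifai list parse correctly.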

Clarifai
created on 2023-10-27

vehicle 99.6
people 99.3
adult 96.4
group together 95
war 95
military 94.4
transportation system 94.3
soldier 92.3
man 92.1
car 91.3
group 91
administration 86.5
skirmish 86.3
street 83.1
vintage 81
retro 79.4
woman 78.7
wear 78.4
monochrome 77.9
tank 77.3

Imagga
created on 2022-01-23

vehicle 25.9
snow 22.9
old 22.3
bench 21.9
park bench 20.3
wheeled vehicle 19
landscape 17.8
car 17.3
tree 16.2
transportation 16.1
sky 15.3
winter 14.5
house 14.2
rural 14.1
vintage 14.1
seat 14
grunge 13.6
sand 13.5
travel 13.4
dirt 13.4
trees 13.3
structure 12.9
road 12.6
building 12.5
machine 12.5
outdoor 12.2
cold 12.1
transport 11.9
industrial 11.8
dirty 11.7
architecture 11.7
park 11.5
construction 11.1
scene 10.4
antique 10.4
black 10.2
retro 9.8
outdoors 9.8
roof 9.6
work 9.6
industry 9.4
season 9.4
stone 9.3
field 9.2
city 9.1
motor vehicle 9
farm 8.9
trailer 8.9
furniture 8.8
snowy 8.8
outside 8.6
wood 8.3
land 8.3
tractor 7.9
dust 7.8
ancient 7.8
housing 7.7
clouds 7.6
weathered 7.6
desert 7.6
drive 7.6
wheel 7.5
frame 7.5
street 7.4
water 7.3
danger 7.3
mobile home 7.2
scenery 7.2
holiday 7.2
history 7.2
grass 7.1
country 7
scenic 7
seasonal 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.4
outdoor 98.3
grass 96.9
old 87
drawing 86.8
vehicle 84.2
black and white 81.8
sketch 78.1
land vehicle 72.9
military vehicle 68.1
white 60
wheel 54.9
drawn 46.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Female, 62%
Fear 62.6%
Sad 12.1%
Calm 9.3%
Happy 8.2%
Angry 2.9%
Surprised 2.8%
Disgusted 1.4%
Confused 0.7%

AWS Rekognition

Age 36-44
Gender Male, 75.4%
Calm 40.5%
Fear 27.5%
Sad 10.9%
Happy 7.3%
Surprised 4.6%
Confused 4.1%
Angry 3.2%
Disgusted 1.8%

AWS Rekognition

Age 23-33
Gender Female, 82.1%
Calm 52.5%
Fear 31.2%
Sad 9.2%
Happy 4%
Angry 1.1%
Disgusted 1%
Confused 0.7%
Surprised 0.4%

AWS Rekognition

Age 48-56
Gender Male, 85.2%
Calm 98.1%
Happy 0.6%
Angry 0.4%
Sad 0.3%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%
Confused 0%

AWS Rekognition

Age 20-28
Gender Female, 65.4%
Happy 98%
Surprised 0.6%
Fear 0.6%
Calm 0.5%
Angry 0.2%
Sad 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 24-34
Gender Male, 75.9%
Fear 78.5%
Surprised 5.7%
Happy 4.6%
Sad 3.3%
Calm 3%
Disgusted 2.3%
Angry 1.4%
Confused 1.1%

AWS Rekognition

Age 18-26
Gender Female, 90.7%
Calm 88.5%
Happy 5.6%
Sad 2.7%
Fear 1.1%
Disgusted 0.7%
Surprised 0.5%
Confused 0.4%
Angry 0.4%
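Each AWS Rekognition face record above reports a distribution of emotion scores; the face's dominant emotion is simply the highest-scoring entry. A minimal sketch, with the scores held in a plain dict transcribed from the first record above:

```python
# Emotion percentages for one face, copied from the first
# AWS Rekognition record above.
face = {
    "Fear": 62.6, "Sad": 12.1, "Calm": 9.3, "Happy": 8.2,
    "Angry": 2.9, "Surprised": 2.8, "Disgusted": 1.4, "Confused": 0.7,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))
# → ('Fear', 62.6)
```

Note that a dominant score this low (62.6%) still leaves substantial probability mass on other emotions, which is why the full distribution is listed for each face rather than a single label.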

Feature analysis

Amazon

Wheel 99%
Person 98.7%

Text analysis

Amazon

24060

Google

24060
24060