Human Generated Data

Title

Untitled (men standing next to overturned car, Sarasota, FL)

Date

1948

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5416

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99
Human 99
Person 98.6
Person 97.7
Person 97
Person 96.6
Person 95.9
Person 95.5
Person 95
Person 95
Person 89.9
Person 85
Art 79.3
Drawing 77.3
Person 76.8
Outdoors 73.9
Nature 73.4
Person 72.3
Wheel 68.1
Machine 68.1
People 66.5
Workshop 60.6
Clothing 59
Apparel 59
Sketch 57.3
Tree 57
Plant 57

Clarifai
created on 2023-10-26

vehicle 99.9
people 99.7
transportation system 98.4
military 98.1
group together 97.6
war 96.8
adult 96.5
group 96.1
soldier 95.1
skirmish 92
many 90.2
vintage 89.5
campsite 89.4
driver 87.7
aircraft 87
man 86.3
car 86.2
cavalry 84.3
army 83.9
military vehicle 83.9

Imagga
created on 2022-01-23

landscape 23.1
snow 19.7
sky 18.5
man 16.8
tree 16.2
old 16
travel 14.8
mountain 14.6
graffito 14.6
winter 14.5
cemetery 14.2
vacation 13.9
trees 13.3
people 12.8
danger 12.7
person 12.5
park 11.5
outdoor 11.5
forest 11.3
building 11.3
clouds 11
field 10.9
male 10.7
outdoors 10.5
rock 10.4
architecture 10.2
light 10
history 9.8
summer 9.7
weather 9.6
sport 9.5
adventure 9.5
decoration 9.2
scenic 8.8
fog 8.7
ancient 8.7
cold 8.6
sand 8.5
hill 8.4
stone 8.4
dark 8.4
tourist 8.2
industrial 8.2
road 8.1
sunset 8.1
farm 8
rural 7.9
work 7.9
country 7.9
urban 7.9
adult 7.8
destruction 7.8
high 7.8
season 7.8
culture 7.7
horse 7.6
structure 7.6
water 7.3
dirty 7.2
world 7.2
grass 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

outdoor 99.5
grass 97.4
text 97.1
wheel 92.3
tractor 92.1
drawing 89.1
vehicle 88.3
tire 87.2
land vehicle 87.1
black and white 81.4
sketch 76.1
auto part 72
old 61.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Male, 98.6%
Happy 62%
Calm 25.6%
Surprised 4.8%
Angry 2.5%
Sad 1.7%
Disgusted 1.4%
Fear 1.1%
Confused 0.9%

AWS Rekognition

Age 23-33
Gender Male, 97.3%
Calm 48%
Happy 27.8%
Disgusted 12.6%
Angry 3.4%
Surprised 3.1%
Confused 2.1%
Sad 1.8%
Fear 1.2%

AWS Rekognition

Age 24-34
Gender Male, 77.3%
Calm 96.2%
Sad 1.7%
Happy 1.1%
Angry 0.4%
Fear 0.2%
Confused 0.2%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 40-48
Gender Female, 93.9%
Calm 65.6%
Happy 24.3%
Sad 3.7%
Angry 2.9%
Surprised 1.3%
Disgusted 1%
Confused 0.7%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99%
Wheel 68.1%

Categories

Text analysis

Amazon

24070

Google

OLO
OLO