Human Generated Data

Title

Untitled (man shining shoes in park)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7558

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.1
Human 99.1
Clothing 98.7
Apparel 98.7
Person 98.4
Person 94
Furniture 88.5
Text 73.6
Suit 70.9
Coat 70.9
Overcoat 70.9
Face 70.5
Female 69.3
Sitting 66.5
Chair 65.8
Gown 65.5
Fashion 65.5
People 64.4
Robe 62.7
Photography 61.6
Photo 61.6
Plant 58.5
Wedding 57.2
Evening Dress 56.3
Wedding Gown 56
Tree 55.7

Clarifai
created on 2023-10-25

people 99.4
administration 95.7
adult 95.3
many 94.3
man 93.5
war 93.2
group 92.3
military 91.9
woman 91.4
monochrome 90.6
group together 88.7
furniture 86.7
several 83.4
wear 83.4
skirmish 82.7
two 82.5
waste 80
newspaper 78
leader 77.5
street 77.2

Imagga
created on 2022-01-08

carousel 31.9
mosquito net 27.1
ride 26.5
sketch 22.5
drawing 20.7
mechanical device 20
freight car 19.4
protective covering 17.1
car 16.3
grunge 14.5
black 13.8
man 13.4
mechanism 13.4
representation 13.3
people 12.8
newspaper 12.8
covering 12.6
wheeled vehicle 12
dark 11.7
silhouette 10.8
vintage 10.7
light 10.7
night 10.7
vehicle 10.2
person 10
water 10
product 9.9
art 9.9
old 9.8
device 9.2
dirty 9
creation 8.8
house 8.4
fashion 8.3
music 8.1
architecture 7.8
sitting 7.7
winter 7.7
city 7.5
style 7.4
building 7.2
history 7.2
park 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.4
black and white 88.4
tree 73.5
monochrome 60.4

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Male, 98.6%
Calm 87.4%
Sad 6.1%
Happy 2.7%
Disgusted 1.8%
Surprised 0.9%
Confused 0.5%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 29-39
Gender Male, 99.4%
Calm 98.9%
Surprised 0.4%
Happy 0.4%
Angry 0.1%
Fear 0.1%
Sad 0.1%
Confused 0%
Disgusted 0%

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft
created on 2022-01-08

text 43.2%

Text analysis

Amazon

28716.

Google

28716.
प्रत 28716.
प्रत