Human Generated Data

Title

Untitled (men seated in Hookers Klub car on train)

Date

1937

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8192

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.1
Human 99.1
Person 98.4
Person 92.4
Person 91.7
Person 80.6
Art 73.5
People 66.7
Drawing 65.3
Person 65
Face 64
Clothing 63.4
Apparel 63.4
Sketch 58.1
Canvas 57.2
Collage 56.4
Advertisement 56.4
Poster 56.4
LCD Screen 55.2
Electronics 55.2
Screen 55.2
Monitor 55.2
Display 55.2
Person 41.9

Clarifai
created on 2023-10-25

people 99.9
child 98.2
group 98
monochrome 96.5
adult 95.6
woman 94.8
several 94.1
man 93.4
indoors 92.7
group together 92.1
sit 91.7
administration 90.8
room 90.7
wear 89.1
many 88.3
leader 86.8
war 86.6
chair 85.4
furniture 85.3
recreation 83.7

Imagga
created on 2022-01-08

newspaper 16.5
grunge 16.2
cockpit 14.6
incubator 14.2
old 13.2
equipment 13.2
product 12.6
currency 12.6
house 12.5
cash 11.9
city 11.6
vintage 11.6
business 11.5
apparatus 11.3
texture 11.1
finance 11
pattern 10.9
negative 10.4
art 10.4
film 10.2
vehicle 10.2
dollar 10.2
bank 9.9
creation 9.9
home 9.6
shower curtain 9.4
protective covering 9.2
architecture 8.8
screen 8.8
bill 8.6
money 8.5
windshield 8.3
investment 8.2
retro 8.2
aged 8.1
car 8.1
building 7.9
urban 7.9
people 7.8
antique 7.8
travel 7.7
furnishing 7.7
curtain 7.5
covering 7.4
town 7.4
economy 7.4
man 7.4
paper 7.4
person 7.3
metal 7.2
photographic paper 7.2
wealth 7.2
financial 7.1
interior 7.1
modern 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.1
black and white 84.2
newspaper 82.2
monochrome 66.2
person 62.8

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 31-41
Gender Male, 51.3%
Calm 99.8%
Happy 0.1%
Confused 0%
Disgusted 0%
Sad 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 97.3%
Calm 91.8%
Sad 7.1%
Confused 0.4%
Surprised 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 27-37
Gender Male, 98.8%
Calm 99.6%
Sad 0.2%
Surprised 0.1%
Confused 0%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Categories

Text analysis

Amazon

8his
8his YE3RAS
YE3RAS