Human Generated Data

Title

Untitled (man leaning on car and holding a rifle)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5130

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.8
Human 98.8
Clothing 96.5
Apparel 96.5
Face 84
Coat 74.1
Car 68.1
Transportation 68.1
Vehicle 68.1
Automobile 68.1
Photography 62.7
Photo 62.7
Portrait 62.7
Suit 57.5
Overcoat 57.5

Clarifai
created on 2023-10-26

people 99.2
monochrome 97.8
man 95.5
adult 94.7
uniform 88.4
lid 86.2
portrait 84.9
one 81.3
luxury 81.1
indoors 80.6
gloves 79.7
black and white 78.7
scientist 78.5
car 78.3
facial expression 77.3
actor 76.9
transportation system 76.5
wear 75.7
wedding 75.3
vehicle 73

Imagga
created on 2022-01-23

person 20.9
man 19.1
people 17.3
male 16.3
fashion 15.8
art 15.2
silhouette 14.9
human 14.2
sexy 12.9
black 12.1
face 12.1
style 11.9
model 11.7
portrait 11
business 10.9
science 10.7
player 10.6
attractive 10.5
body 10.4
adult 10.3
stylish 9.9
team 9.9
businessman 9.7
design 9.6
hair 9.5
sensual 9.1
suit 9
space 8.5
future 8.4
sport 8.2
technology 8.2
negative 8.1
statue 8.1
posing 8
lifestyle 7.9
ball 7.9
youth 7.7
mystic 7.5
glowing 7.4
artwork 7.3
make 7.3
music 7.2
bright 7.1
vibrant 7

Microsoft
created on 2022-01-23

text 99.2
person 86
outdoor 85.4
vehicle 77.9
car 76.5
land vehicle 75.8
black and white 70.5
clothing 56.3
posing 42.3

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Calm 97.6%
Confused 1%
Surprised 0.4%
Sad 0.4%
Happy 0.3%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Car 68.1%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

15184.
AB

Google

)5।१५.
J5184.
NACON-
)5।१५. J5184. NACON- YTERA2-MAMTZA3
YTERA2-MAMTZA3