Human Generated Data

Title

Untitled (man and woman touch trunk of elephant; soldier watches at left)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4909

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.4
Person 98.9
Elephant 96.9
Animal 96.9
Mammal 96.9
Wildlife 96.9
Clothing 96.1
Apparel 96.1
Elephant 88.5
Person 84.8
Blonde 83.6
Teen 83.6
Female 83.6
Girl 83.6
Kid 83.6
Woman 83.6
Child 83.6
Outdoors 83.5
Elephant 82.5
Shelter 81.7
Building 81.7
Rural 81.7
Nature 81.7
Countryside 81.7
Shoe 81.7
Footwear 81.7
Face 78.9
Shorts 76.8
Pants 74.3
People 71.1
Text 69.8
Dress 67.5
Crowd 64.3
Suit 61.8
Overcoat 61.8
Coat 61.8
Skin 59.3
Drawing 56.7
Art 56.7
Furniture 56.6
Table 56.3
Floor 55.8
Paper 55.7
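
The Amazon tags above pair a label with a confidence percentage, which is the shape of output returned by AWS Rekognition's DetectLabels operation. As a minimal sketch only, assuming boto3 with configured AWS credentials and a hypothetical local copy of the photograph named photo.jpg:

    import boto3

    # Hypothetical local copy of the photograph
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    client = boto3.client("rekognition")

    # MinConfidence=55 roughly matches the lowest score listed above (Paper 55.7)
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))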

Clarifai
created on 2023-10-27

people 99.7
group 96.3
man 94.5
education 94.3
monochrome 93
adult 92.9
woman 90.1
child 85
teacher 83.7
art 81.4
many 80.7
illustration 79.3
school 77.8
dancing 74.8
crowd 71
music 70.1
musician 67.4
actor 66.9
audience 62.3
uniform 62

Imagga
created on 2022-01-23

graffito 47.6
decoration 31.6
freight car 23.4
car 21.5
people 19
art 16.6
person 15.9
wheeled vehicle 15
dark 13.4
billboard 13.1
symbol 12.8
grunge 12.8
silhouette 12.4
vehicle 12.1
man 12.1
design 11.8
crowd 11.5
stage 11.1
event 11.1
business 10.9
signboard 10.6
style 10.4
men 10.3
flag 9.9
old 9.7
player 9.7
black 9.6
sport 9.5
scene 9.5
graphic 9.5
male 9.2
city 9.1
film 9.1
fashion 9
team 9
structure 8.7
light 8.7
party 8.6
astronaut 8.2
drawing 8.1
music 8.1
sexy 8
businessman 7.9
negative 7.9
urban 7.9
sky 7.6
world 7.4
park 7.4
street 7.4
daily 7.3
paint 7.2
success 7.2
color 7.2
body 7.2
adult 7.2
modern 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.5
outdoor 95.5
clothing 88.3
person 79.5
drawing 78.7
posing 74
black and white 63.1
footwear 55.1
woman 50.8
old 47.2

Color Analysis

Face analysis

AWS Rekognition

Age 42-50
Gender Female, 75.2%
Calm 97.1%
Sad 2.4%
Surprised 0.2%
Fear 0.1%
Disgusted 0.1%
Confused 0.1%
Angry 0%
Happy 0%

AWS Rekognition

Age 23-33
Gender Female, 93.1%
Calm 95.9%
Sad 1.2%
Surprised 1%
Happy 0.8%
Disgusted 0.5%
Confused 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Female, 99.9%
Calm 94.7%
Happy 3%
Surprised 1.1%
Sad 0.4%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 21-29
Gender Male, 99.7%
Sad 63.9%
Calm 28%
Happy 6%
Angry 1%
Confused 0.4%
Surprised 0.3%
Fear 0.3%
Disgusted 0.2%
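
Each block above reports an estimated age range, a gender guess with confidence, and an emotion distribution, which matches the per-face fields of Rekognition's DetectFaces operation when full attributes are requested. A sketch under the same assumptions as the DetectLabels example (reusing its client and image bytes):

    # Reuses the Rekognition client and image_bytes from the earlier sketch
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort to mirror the listings above
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")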

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
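
Unlike Rekognition, Google Vision reports each face attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a percentage; these are the face_annotations fields of the Cloud Vision API. A minimal sketch, assuming the google-cloud-vision client library and the same hypothetical photo.jpg:

    from google.cloud import vision

    gv_client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = gv_client.face_detection(image=image)
    for face in response.face_annotations:
        # Each field is a Likelihood enum value, e.g. VERY_UNLIKELY
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)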

Feature analysis

Amazon

Person 99.6%
Elephant 96.9%
Shoe 81.7%

Categories

Text analysis

Amazon

19011.
١٩٥١١٠
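
Both readings above, including the Arabic-Indic digit transcription, are typical of Rekognition's DetectText output, which can return multiple candidate transcriptions of the same painted or printed characters. A sketch under the same assumptions as the earlier Rekognition examples:

    # Reuses the Rekognition client and image_bytes from the DetectLabels sketch
    response = client.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip word-level duplicates
            print(detection["DetectedText"], round(detection["Confidence"], 1))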

Google

19001
19001