Human Generated Data

Title

Untitled (man with two children next to fireplace)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10554

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Shorts 100
Clothing 100
Apparel 100
Person 99.8
Human 99.8
Person 99.5
Person 98.7
Female 96.6
Footwear 91.7
Shoe 91.7
Woman 87.6
Person 86.4
Blonde 83.3
Girl 83.3
Kid 83.3
Teen 83.3
Child 83.3
Skirt 82.5
Dress 77.8
Face 75.4
Shoe 71.3
Indoors 70.9
People 66.8
Portrait 66.3
Photography 66.3
Photo 66.3
Furniture 62.6
Suit 61.2
Coat 61.2
Overcoat 61.2
Screen 58.8
Electronics 58.8
Tree 58.4
Plant 58.4
Collage 57.2
Advertisement 57.2
Poster 57.2
Man 55.9
Fireplace 55.8
Monitor 55.5
Display 55.5

Clarifai
created on 2023-10-25

people 99.9
child 98.3
group together 95.8
man 94.7
group 94
dancing 93.3
adult 93.1
boy 92.5
two 90.8
woman 90.2
three 89.8
music 89.6
boxer 87.6
education 87.3
wear 87.2
several 86.9
dancer 85.4
recreation 85.3
family 83.7
four 83

Imagga
created on 2022-01-09

newspaper 34.5
product 26.5
people 25.1
man 22.2
person 21.7
creation 21.2
adult 21
world 17
business 17
black 16.8
women 16.6
city 15.8
teacher 15.1
male 15
portrait 14.9
urban 14
pretty 11.9
dress 11.7
building 11.7
businessman 11.5
human 11.2
outdoors 11.2
professional 11.2
two 11
educator 10.8
face 10.7
interior 10.6
travel 10.6
men 10.3
motion 10.3
youth 10.2
lifestyle 10.1
sport 9.9
lady 9.7
hair 9.5
dancer 9.4
exercise 9.1
fashion 9
performer 8.8
happy 8.8
smiling 8.7
work 8.6
walking 8.5
casual 8.5
head 8.4
silhouette 8.3
indoor 8.2
one 8.2
life 8.1
activity 8.1
room 8
cute 7.9
couple 7.8
smile 7.8
model 7.8
corporate 7.7
office 7.7
attractive 7.7
move 7.7
communication 7.6
legs 7.5
light 7.4
occupation 7.3
design 7.3
girls 7.3
group 7.3
architecture 7.2
transportation 7.2
job 7.1
working 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

clothing 97.7
text 97
footwear 94
person 91.9
black and white 52.5
woman 52.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 41-49
Gender Male, 96.7%
Happy 51.1%
Calm 38.7%
Surprised 3.6%
Angry 2.3%
Sad 2%
Disgusted 1%
Fear 0.9%
Confused 0.4%

AWS Rekognition

Age 37-45
Gender Male, 55.2%
Happy 49.6%
Sad 16.5%
Calm 15.6%
Surprised 11%
Fear 3.5%
Disgusted 1.7%
Angry 1.2%
Confused 1%

AWS Rekognition

Age 28-38
Gender Female, 70.7%
Calm 67.2%
Sad 28%
Confused 1.5%
Disgusted 1.4%
Fear 0.7%
Happy 0.5%
Surprised 0.4%
Angry 0.3%

Feature analysis

Amazon

Person 99.8%
Shoe 91.7%

Text analysis

Amazon

20527.
20529.

Google

20527: १
20527: