Human Generated Data

Title

Untitled (couple talking in front of Heinz display)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4429

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Person 97.8
Clothing 94.7
Apparel 94.7
Person 94.5
Person 91.7
Sleeve 77.5
Clinic 67.1
Coat 66.8
Crowd 62.3
Hair 60.4
Long Sleeve 58.2
Text 57.2
Doctor 57.0
Hairdresser 55.3
Worker 55.3
Person 51.4
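
Tags like these match the output shape of AWS Rekognition's DetectLabels operation. A minimal sketch using boto3, assuming a local copy of the image (the file name is hypothetical):

```python
# Minimal sketch: Rekognition-style label tags for a local image.
# "steinmetz_heinz_display.jpg" is a placeholder, not the actual scan.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_heinz_display.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # the list above bottoms out just above 50%
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```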

Clarifai
created on 2023-10-26

people 99.4
woman 98.4
man 98
adult 94
indoors 88.3
group 86
monochrome 84.4
child 84
family 81.9
touch 81.3
illustration 79.2
option 78.6
healthcare 75.1
science 75
connection 73.7
uniform 70.8
business 70.6
education 70.5
two 70.1
groom 69.8
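
Clarifai concept tags like these come from a predict call against one of its vision models. A hedged sketch against the v2 REST API; the model ID, API key, and image URL are all assumptions, since the record does not say which model was used:

```python
# Hedged sketch of a Clarifai v2 predict request. Model ID, API key, and
# image URL are placeholders/assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed general-purpose model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}]},
)

# Concept values are 0-1; scale to match the percentages listed above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```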

Imagga
created on 2022-01-23

man 30.9
people 30.1
business 25.5
male 24.2
person 23
adult 19.8
work 19.6
businessman 19.4
professional 19
office 16.4
team 16.1
job 15.9
men 15.4
blackboard 15.4
world 14.9
hand 14.6
corporate 14.6
modern 14
education 13.8
worker 13.4
room 13
human 12.7
teacher 12.2
group 12.1
black 12
instrument 11.7
happy 11.3
casual 11
businesswoman 10.9
meeting 10.4
teamwork 10.2
student 10.2
board 9.9
success 9.7
technology 9.6
hands 9.6
manager 9.3
finance 9.3
holding 9.1
portrait 9.1
scientist 8.8
looking 8.8
assistant 8.7
women 8.7
communication 8.4
attractive 8.4
study 8.4
silhouette 8.3
classroom 8.2
executive 8
medical 7.9
employee 7.8
scientific 7.7
chemistry 7.7
class 7.7
two 7.6
career 7.6
sign 7.5
presentation 7.4
company 7.4
globe 7.4
occupation 7.3
indoor 7.3
new 7.3
school 7.3
music 7.2
life 7.2
smile 7.1
working 7.1
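
Imagga's weighted tags correspond to its v2 tagging endpoint, which authenticates with HTTP basic auth using an API key/secret pair. A sketch with placeholder credentials and image URL:

```python
# Sketch of Imagga's /v2/tags endpoint; credentials and URL are placeholders.
import requests

auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},
    auth=auth,
)

# Imagga reports confidence on a 0-100 scale, matching the list above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```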

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.8
clothing 95.1
person 90.5
man 79.7
human face 71.8
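
The Microsoft tags fit Azure Computer Vision's analyze operation. A sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders, and the exact API version used for this record is an assumption:

```python
# Sketch of Azure Computer Vision v3.2 "analyze" with the Tags feature.
# Endpoint, key, and image URL are placeholders.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
key = "YOUR_AZURE_KEY"

response = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.com/image.jpg"},
)

# Tag confidences are 0-1; scale to percentages as listed above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```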

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 87.9%
Calm 98.6%
Angry 0.5%
Sad 0.3%
Fear 0.2%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 13-21
Gender Female, 92.1%
Calm 96.6%
Surprised 1.8%
Sad 0.7%
Angry 0.4%
Happy 0.3%
Disgusted 0.1%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 18-26
Gender Female, 97.3%
Angry 39%
Surprised 28.5%
Happy 12.1%
Calm 9.1%
Fear 5.7%
Disgusted 2.4%
Sad 1.6%
Confused 1.5%
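
The three records above have the shape of Rekognition's DetectFaces output with all attributes requested: an age range, a gender estimate, and per-emotion confidences for each detected face. A minimal boto3 sketch (file name hypothetical):

```python
# Sketch of Rekognition DetectFaces with full attributes; prints one block
# per detected face, emotions sorted by confidence as in the records above.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_heinz_display.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```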

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
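
The two Google Vision records report likelihood buckets rather than percentages. A sketch using the google-cloud-vision client (file path hypothetical):

```python
# Sketch of Google Cloud Vision face detection; each face annotation carries
# likelihood enums (VERY_UNLIKELY ... VERY_LIKELY) for the attributes above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_heinz_display.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```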

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

paintings art 98.2%
interior objects 1.1%
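
These categories look like output from Imagga's categorization endpoint; "paintings art" and "interior objects" are buckets of its personal_photos categorizer, though that categorizer ID is an assumption here:

```python
# Sketch of Imagga's /v2/categories endpoint. The "personal_photos"
# categorizer ID is an assumption; credentials and URL are placeholders.
import requests

auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/image.jpg"},
    auth=auth,
)

for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")
```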

Text analysis

Amazon

HEINZ
17295.
17295. AOOX
DLANS
VI37A2
VI37A2 -
AOOX
562L1
-
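
Strings like these are typical Rekognition DetectText output, which returns both full lines and individual words, including stray marks on the print that it reads as characters. A minimal boto3 sketch (file name hypothetical):

```python
# Sketch of Rekognition DetectText; prints line-level detections only.
import boto3

client = boto3.client("rekognition")

with open("steinmetz_heinz_display.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```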

Google

17295 ANS HEINZ 17295.
ANS
17295.
17295
HEINZ
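
Google's list matches its Vision OCR output, where the first annotation is the full detected text and the rest are individual words. A sketch with the google-cloud-vision client (file path hypothetical):

```python
# Sketch of Google Cloud Vision text detection (OCR).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_heinz_display.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the full text; the rest are word-level detections.
for annotation in response.text_annotations:
    print(annotation.description)
```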