Human Generated Data

Title

Untitled (standing couple talking to group seated at table)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5329

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.5
Person 99.5
Person 99.3
Person 98.8
Person 98.7
Person 98.5
Person 98.2
Person 97.8
Person 97.1
Person 97
Person 95.5
Apparel 94.3
Clothing 94.3
Person 89.8
Crowd 80.9
People 79.8
Face 79.4
Person 71.8
Female 65.7
Person 65.6
Person 63.8
Girl 60.2
Suit 57.3
Coat 57.3
Overcoat 57.3
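
The Amazon tags above are machine-generated labels with confidence scores on a 0-100 scale. A minimal sketch of how such labels can be produced with the AWS Rekognition DetectLabels call through boto3; the file name, region, and confidence threshold are illustrative assumptions, not values taken from this record.

    # Sketch: label detection with Amazon Rekognition via boto3.
    # The local file name and region are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_5329.jpg", "rb") as f:   # hypothetical local copy of the photograph
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,                         # roughly matches the lowest score listed above
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')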

Clarifai
created on 2023-10-26

people 99.9
group 99.6
many 99.5
adult 98.4
man 97.8
woman 97.5
group together 97.3
leader 96.9
administration 96.3
crowd 94.5
music 89.9
war 87.4
audience 87.1
several 87
child 86.5
dancing 85.6
wear 82.7
musician 81.9
handshake 81.1
party 80.4
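
The Clarifai concepts above are likewise model predictions with confidence scores. A rough sketch, assuming Clarifai's v2 predict REST endpoint, its general concept model, and an app-scoped API key; the model ID, key, and image URL are all placeholders, and the exact endpoint shape depends on the account setup.

    # Sketch only: concept prediction against Clarifai's v2 predict endpoint.
    # Key, model ID, and image URL are placeholder assumptions.
    import requests

    CLARIFAI_KEY = "YOUR_API_KEY"                 # hypothetical app-scoped key
    MODEL_ID = "general-image-recognition"        # assumed general concept model

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/steinmetz_5329.jpg"}}}]},
    )

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')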

Imagga
created on 2022-01-22

people 31.8
male 29.8
man 26.9
person 26.9
team 23.3
businessman 22.9
group 22.5
business 20.6
men 20.6
work 19.6
adult 17.7
women 16.6
professional 16.1
teamwork 15.7
businesswoman 14.5
happy 13.8
worker 13.7
stage 13.5
portrait 12.9
kin 12.8
black 12.6
job 12.4
office 12
corporate 12
suit 11.7
life 11.6
together 11.4
couple 11.3
human 11.2
student 11
world 10.6
room 10.6
success 10.5
nurse 10.2
modern 9.8
working 9.7
medical 9.7
education 9.5
meeting 9.4
instrument 9.4
manager 9.3
successful 9.1
teacher 9
new 8.9
brass 8.8
looking 8.8
lab 8.7
chemistry 8.7
clothing 8.5
senior 8.4
company 8.4
silhouette 8.3
photographer 8
building 8
home 8
smiling 8
laboratory 7.7
musical instrument 7.7
wind instrument 7.7
two 7.6
ethnic 7.6
hand 7.6
plan 7.6
uniform 7.2
color 7.2
lifestyle 7.2
coat 7.2
science 7.1
medicine 7
indoors 7
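
The Imagga tags above follow the same pattern. A sketch assuming Imagga's /v2/tags endpoint with HTTP basic authentication; the credentials and image URL are placeholders, not values from this record.

    # Sketch only: tagging via Imagga's /v2/tags endpoint.
    # Credentials and image URL are placeholder assumptions.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/steinmetz_5329.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )

    for tag in resp.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')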

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.3
text 99.2
clothing 94.9
outdoor 93.1
man 84
black and white 69.4
group 66.5
people 66.4
old 44.6
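
The Microsoft tags above resemble the output of the Azure Computer Vision analyze endpoint with tag detection enabled. A sketch under that assumption; the resource endpoint, subscription key, and image URL are placeholders.

    # Sketch only: tag detection via the Azure Computer Vision v3.2 analyze call.
    # Endpoint host, key, and image URL are placeholder assumptions.
    import requests

    AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
    AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"

    resp = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": "https://example.org/steinmetz_5329.jpg"},
    )

    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')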

Color Analysis

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 51.1%
Calm 77.1%
Sad 10.1%
Happy 6.4%
Surprised 2.4%
Confused 1.7%
Disgusted 1.2%
Angry 1%
Fear 0.2%

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Calm 99.8%
Confused 0.1%
Angry 0%
Sad 0%
Disgusted 0%
Happy 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 88.4%
Calm 97%
Happy 1.8%
Sad 0.6%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%
Surprised 0.1%
Confused 0.1%

AWS Rekognition

Age 54-62
Gender Male, 97.1%
Sad 58.5%
Confused 18.2%
Calm 12.5%
Surprised 4.1%
Fear 3.2%
Happy 1.3%
Angry 1.1%
Disgusted 1.1%

AWS Rekognition

Age 27-37
Gender Female, 88.5%
Calm 83.5%
Sad 8.5%
Happy 4.4%
Angry 1.6%
Confused 0.9%
Disgusted 0.5%
Surprised 0.4%
Fear 0.3%
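
The AWS Rekognition entries above (an age range, a gender estimate, and an emotion distribution for each detected face) match the shape of the DetectFaces response when all facial attributes are requested. A minimal boto3 sketch; the file name and region are assumptions.

    # Sketch: per-face age, gender, and emotion estimates via Rekognition DetectFaces.
    # File name and region are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_5329.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],                       # request AgeRange, Gender, Emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(f'Age {age["Low"]}-{age["High"]}, '
              f'{gender["Value"]} {gender["Confidence"]:.1f}%, '
              f'{top_emotion["Type"].title()} {top_emotion["Confidence"]:.1f}%')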

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
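
The Google Vision entries above report per-face likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision client's face_detection method; the local file path is an assumption.

    # Sketch: face detection with the Google Cloud Vision client library.
    # The local file path is a placeholder.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_5329.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print(
            "Joy", vision.Likelihood(face.joy_likelihood).name,
            "| Sorrow", vision.Likelihood(face.sorrow_likelihood).name,
            "| Anger", vision.Likelihood(face.anger_likelihood).name,
            "| Surprise", vision.Likelihood(face.surprise_likelihood).name,
            "| Headwear", vision.Likelihood(face.headwear_likelihood).name,
            "| Blurred", vision.Likelihood(face.blurred_likelihood).name,
        )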

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

17539.

Google

."3 5 ר( ב 5 ד = eaדAe-MANMב
."3
eaדAe-MANMב
5
ר(
ב
ד
=
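
The text analysis rows are OCR output: Amazon returned the string "17539." and Google returned the full detected string followed by its individual fragments. A minimal sketch of the Amazon side using Rekognition's DetectText call via boto3; the file name and region are assumptions.

    # Sketch: text detection with Amazon Rekognition DetectText.
    # File name and region are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_5329.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":           # skip the per-word duplicates
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')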