Human Generated Data

Title

Untitled (two women and a man leaning over a railing)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8319

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.3
Human 99.3
Person 98.9
Person 98.7
Clothing 98.1
Apparel 98.1
Tie 96.5
Accessories 96.5
Accessory 96.5
Railing 92.1
Coat 88.3
Overcoat 70.4
Suit 69.5
Handrail 57.1
Banister 57.1
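
The Amazon tags above have the shape of output from the AWS Rekognition label-detection API (a label name plus a confidence score). The sketch below shows one way such a tag list could be produced with boto3; the image file name and confidence threshold are illustrative assumptions, not a record of how these tags were actually generated.

    # Sketch: label detection with AWS Rekognition (boto3).
    # The image path and MinConfidence value are assumptions for illustration.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical local file
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,
        )

    # Print each label with its confidence, matching the "Label 99.3" style above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")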

Clarifai
created on 2023-10-25

people 99.7
group 97.8
adult 97.5
woman 96.7
portrait 96.4
monochrome 95.8
man 95.3
musician 94.6
singer 93
music 92.7
leader 88.8
group together 87.3
jazz 81.4
facial expression 81.1
audience 80.7
instrument 80.3
several 79.8
five 79.6
actor 79.1
microphone 79.1

Imagga
created on 2022-01-08

marimba 100
percussion instrument 100
musical instrument 100
man 53.8
vibraphone 49
male 44
businessman 38.9
office 36.2
business 35.8
people 34
adult 33.3
person 31.7
professional 31
work 25.1
mature 24.2
businesspeople 23.7
colleagues 23.3
smiling 23.2
meeting 22.6
happy 22.6
table 22.5
corporate 22.3
team 21.5
sitting 21.5
men 21.5
handsome 21.4
desk 20.8
executive 20.5
portrait 20.1
businesswoman 20
job 19.5
device 19
worker 18.7
senior 17.8
working 17.7
smile 17.1
looking 16.8
manager 16.8
indoors 16.7
laptop 16.4
suit 16.2
group 16.1
casual 16.1
room 15.6
computer 15.2
doctor 15
20s 14.7
associates 13.8
coworkers 13.8
lifestyle 13.7
confident 13.7
face 13.5
modern 13.3
talking 13.3
cheerful 13
education 13
occupation 12.8
color 12.8
indoor 12.8
teamwork 12.1
40s 11.7
discussion 11.7
patient 11.6
staff 11.5
medical 11.5
day 11
corporation 10.6
busy 10.6
together 10.5
couple 10.5
tie 10.4
career 10.4
shirt 10.3
horizontal 10.1
classroom 9.9
teacher 9.8
conference 9.8
success 9.7
30s 9.6
women 9.5
communication 9.2
friendly 9.1
holding 9.1
health 9
human 9
25 30 years 8.8
businessmen 8.8
standing 8.7
elderly 8.6
profession 8.6
serious 8.6
hospital 8.5
presentation 8.4
company 8.4
successful 8.2
technology 8.2
bright 7.9
black 7.8
mid adult 7.7
leader 7.7
stethoscope 7.7
jacket 7.7
attractive 7.7
old 7.7
two 7.6
coffee 7.4
camera 7.4
paper 7.1
happiness 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 97.1
person 92.8
clothing 89.6
posing 87.8
man 87
black and white 80.3
white 67
human face 50.9
old 45.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 45-51
Gender Male, 98.8%
Happy 59.1%
Surprised 21.8%
Calm 15.5%
Confused 1.5%
Disgusted 1%
Sad 0.4%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 48-54
Gender Male, 98.4%
Surprised 39.8%
Happy 35.7%
Calm 16.1%
Sad 4.5%
Fear 1.5%
Angry 0.9%
Disgusted 0.8%
Confused 0.7%

AWS Rekognition

Age 40-48
Gender Male, 69.9%
Surprised 77.5%
Happy 7.5%
Calm 6.2%
Sad 4.1%
Angry 1.3%
Disgusted 1.2%
Fear 1.2%
Confused 1%
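
Each AWS Rekognition block above (an age range, a gender estimate, and ranked emotion scores per detected face) matches the structure returned by the face-detection API when all facial attributes are requested. A minimal sketch with boto3 follows, assuming a local image file; it is not the pipeline used to produce these records.

    # Sketch: face analysis with AWS Rekognition (boto3), requesting all attributes.
    # The image path is a placeholder assumption.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_untitled.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    # One FaceDetails entry per detected face: age range, gender, and emotions.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")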

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
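
The Google Vision blocks above report per-face likelihood buckets (surprise, anger, sorrow, joy, headwear, blur) rather than percentage scores. A sketch of how such results could be read with the google-cloud-vision client library follows; the image file name is an assumption.

    # Sketch: face detection with the Google Cloud Vision client library.
    # The image path is a placeholder assumption.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_untitled.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each face annotation carries likelihood enums such as VERY_UNLIKELY or UNLIKELY.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)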

Feature analysis

Amazon

Person 99.3%
Tie 96.5%

Text analysis

Amazon

10126.
10126
A7DA
NA993902 A7DA
NA993902
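
The Amazon text results above have the shape of AWS Rekognition text detection, which returns both line-level and word-level detections and can therefore list the same string more than once. A minimal sketch with boto3, assuming a local image file:

    # Sketch: text detection with AWS Rekognition (boto3).
    # The image path is a placeholder assumption.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_untitled.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Both LINE and WORD detections are returned, which is why strings can repeat.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])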

Google

10126.
10126. 10126. 10126.