Human Generated Data

Title

Untitled (kneeling man speaking with seated man at the Siesta Key Actors' Theater)

Date

1969

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11378

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Human 99.2
Person 99.2
Chair 97.4
Furniture 97.4
Person 96.5
Clothing 95.9
Apparel 95.9
Shoe 93.5
Footwear 93.5
Coat 79
Overcoat 79
Suit 79
Text 76.3
Female 68.8
Sitting 68.6
Photography 63.9
Photo 63.9
People 62.9

Imagga
created on 2022-01-14

man 33.6
person 32.4
people 30.7
home 28.7
male 24.9
adult 24.7
newspaper 23.8
happy 21.9
barbershop 21.1
smiling 21
shop 19.3
indoors 18.4
professional 18.4
office 17.1
product 16.8
lifestyle 16.6
couple 16.5
happiness 16.5
smile 16.4
clinic 16.2
room 16
senior 15.9
mother 15.2
talking 15.2
hospital 14.6
business 14.6
computer 14.4
portrait 14.2
family 14.2
work 14.2
businessman 14.1
clothing 14
mercantile establishment 13.8
indoor 13.7
casual 13.6
face 13.5
together 13.1
creation 13.1
looking 12.8
nurse 12.7
medical 12.4
color 12.2
mature 12.1
men 12
occupation 11.9
laptop 11.8
businesswoman 11.8
health 11.8
scholar 11.7
care 11.5
desk 11.5
sitting 11.2
women 11.1
patient 11
team 10.7
job 10.6
cheerful 10.6
doctor 10.3
corporate 10.3
20s 10.1
communication 10.1
alone 10
bright 10
discussing 9.8
discussion 9.7
interior 9.7
working 9.7
two people 9.7
group 9.7
30s 9.6
elderly 9.6
businesspeople 9.5
intellectual 9.3
phone 9.2
horizontal 9.2
place of business 9.2
worker 9.1
attractive 9.1
handsome 8.9
brunette 8.7
coat 8.7
love 8.7
father 8.5
meeting 8.5
hairdresser 8.4
child 8.4
old 8.4
house 8.4
table 7.9
life 7.9
70s 7.9
day 7.8
boy 7.8
consultant 7.8
colleagues 7.8
mid adult 7.7
expression 7.7
exam 7.7
two 7.6
manager 7.4
teamwork 7.4
teen 7.3
teenager 7.3
new 7.3
confident 7.3
student 7.2
aged 7.2
dress 7.2
to 7.1
medicine 7

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 97.8
person 97
clothing 93.8
man 92.5
window 92.4
human face 59

Face analysis

Amazon

Google

AWS Rekognition

Age 39-47
Gender Male, 98%
Calm 98.1%
Happy 0.7%
Sad 0.6%
Confused 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Female, 93.9%
Calm 54.4%
Sad 40.3%
Fear 1.1%
Disgusted 1%
Surprised 0.9%
Angry 0.8%
Confused 0.7%
Happy 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Shoe 93.5%

Captions

Microsoft

a man and a woman sitting on a newspaper 33.6%
a man and woman sitting on a newspaper 30.5%
a man and a woman sitting in a newspaper 30.4%

Text analysis

Amazon

57859.
A 57859.
A

Google

57859.
57859.