Human Generated Data

Title

Untitled (woman playing the accordion at the Siesta Key Actors' Theater)

Date

1969

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11383

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Imagga
created on 2022-01-14

newspaper 100
product 100
creation 87.3
people 24.5
man 20.8
home 18.3
daily 18
male 17
old 16.7
business 16.4
person 16
lifestyle 15.2
portrait 14.2
businessman 12.4
adult 12.3
sitting 12
looking 12
vintage 11.6
smiling 11.6
office 11.4
happy 10.6
ancient 10.4
women 10.3
room 10.2
work 10.2
house 10
smile 10
history 9.8
interior 9.7
indoors 9.7
couple 9.6
money 9.4
aged 9
computer 8.8
face 8.5
senior 8.4
attractive 8.4
indoor 8.2
lady 8.1
drawing 8.1
symbol 8.1
family 8
shop 7.9
architecture 7.8
art 7.8
sculpture 7.6
two 7.6
finance 7.6
holding 7.4
technology 7.4
retro 7.4
cup 7.2
kitchen 7.2
mother 7.1
job 7.1
working 7.1
day 7.1
table 7
modern 7

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 99.3
person 96.7
clothing 87.4
woman 58.3
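
The tag/confidence pairs above come from automated image-tagging services. As a rough illustration only (not the museum's actual pipeline), the following is a minimal sketch of how comparable label scores can be produced with AWS Rekognition's DetectLabels via boto3; the file name is hypothetical and credentials are assumed to come from the environment.

    import boto3

    # Hypothetical local path to the catalogued photograph
    IMAGE_PATH = "steinmetz_4.2002.11383.jpg"

    # Rekognition client; region and credentials are taken from the environment
    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,        # cap on the number of returned labels
            MinConfidence=70.0,  # drop low-confidence tags
        )

    # Print label/confidence pairs in the same "tag score" style as the lists above
    for label in response["Labels"]:
        print(f'{label["Name"].lower()} {label["Confidence"]:.1f}')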

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 99.4%
Calm 99.4%
Happy 0.3%
Sad 0.1%
Surprised 0.1%
Disgusted 0.1%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 49-57
Gender Female, 51.4%
Happy 49.5%
Calm 22%
Surprised 12.8%
Sad 7.8%
Confused 3.3%
Angry 2.3%
Disgusted 1.3%
Fear 1%

AWS Rekognition

Age 21-29
Gender Female, 78.2%
Calm 99.6%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%
Sad 0%
Confused 0%
Angry 0%
Surprised 0%
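
Each per-face block above lists an estimated age range, a gender estimate, and emotion confidences. As a hedged sketch (not the museum's actual workflow), comparable output can be obtained from AWS Rekognition's DetectFaces with Attributes=["ALL"] via boto3; the file name is hypothetical.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_4.2002.11383.jpg", "rb") as f:  # hypothetical filename
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    # One entry per detected face, matching the per-face blocks above
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')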

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
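
The Google Vision face results are reported as likelihood levels rather than percentages. The following is a minimal sketch using the google-cloud-vision client library (version 2 or later assumed); the file name is hypothetical.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_4.2002.11383.jpg", "rb") as f:  # hypothetical filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihood fields are enums (VERY_UNLIKELY ... VERY_LIKELY), as listed above
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)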

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a man and a woman standing in front of a window 73.3%
a man and a woman standing next to a window 72.8%
a man and woman standing in front of a window 63.4%
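
Each Microsoft caption above carries a confidence score. As an illustrative sketch (assuming the Azure Computer Vision Describe Image REST endpoint, version 3.2), captions like these can be retrieved as follows; the endpoint, key, and file name are placeholders.

    import requests

    # Placeholder Azure Computer Vision endpoint and key
    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
    KEY = "<subscription-key>"

    with open("steinmetz_4.2002.11383.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # Describe Image call; maxCandidates asks for several alternative captions
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    # Each candidate caption has a 0-1 confidence, shown above as a percentage
    for caption in response.json()["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')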

Text analysis

Amazon

57853.
MJI7--YT3RA°S
KOOK

Google

59853. NAGOX
59853.
NAGOX
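
The strings above are machine transcriptions of text visible in the photograph. As a minimal sketch (not the museum's actual pipeline), similar output can be produced with AWS Rekognition's DetectText via boto3; the file name is hypothetical.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_4.2002.11383.jpg", "rb") as f:  # hypothetical filename
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections correspond to the transcribed strings shown above;
    # WORD detections break each line into individual tokens
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])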