Human Generated Data

Title

Untitled (woman sitting at piano pointing)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4410

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.5
Human 98.5
Electronics 93.7
Person 92
Person 91.1
Person 90.3
Person 86.5
Keyboard 83.5
Person 83.4
Person 78.3
Person 74.4
Person 70.6
Glasses 63.3
Accessory 63.3
Accessories 63.3
Leisure Activities 59.2
Person 49.2
Person 48.2
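
The label tags above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be regenerated with boto3 follows; the file name, region, and confidence threshold are placeholders, not details taken from this record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

# "photo.jpg" stands in for a local copy of the image; the record gives no file path.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=40)

# Print tags in the same "Name Confidence" form used in this record.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")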

Imagga
created on 2022-01-23

negative 100
film 84.8
photographic paper 63.3
photographic equipment 42.2
people 30.1
computer 23.2
office 22.6
person 22
indoors 22
man 21.5
work 21.2
adult 19.6
business 19.4
portrait 19.4
laptop 19.1
senior 18.7
looking 18.4
old 18.1
male 17.7
professional 17.3
working 16.8
camera 16.6
sitting 16.3
smiling 15.2
happy 15
keyboard 15
technology 14.8
face 14.2
elderly 13.4
room 12.9
home 12.8
newspaper 12.5
desk 12.4
lifestyle 12.3
one 11.9
day 11.8
worker 11.6
smile 11.4
attractive 11.2
casual 11
finance 11
alone 11
indoor 11
businesswoman 10.9
businessman 10.6
medical 10.6
education 10.4
doctor 10.3
mature 10.2
health 9.7
retired 9.7
retirement 9.6
hair 9.5
businesspeople 9.5
corporate 9.4
one person 9.4
money 9.4
horizontal 9.2
occupation 9.2
product 9.2
hand 9.1
job 8.8
table 8.8
creation 8.7
laboratory 8.7
men 8.6
serious 8.6
instrument 8.5
notebook 8.4
modern 8.4
glasses 8.3
holding 8.3
human 8.2
aged 8.1
daily 8.1
medicine 7.9
women 7.9
happiness 7.8
lab 7.8
equipment 7.7
pretty 7.7
using 7.7
light 7.3
student 7.2
pensioner 7.2
history 7.2
bright 7.1
copy 7.1
coat 7.1
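
The Imagga tags follow the same tag/confidence pattern and correspond to Imagga's public /v2/tags REST endpoint. A hedged sketch is below; the API key, secret, and file name are placeholders.

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder credentials

with open("photo.jpg", "rb") as f:     # placeholder local file
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each result carries an English tag name and a 0-100 confidence score.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")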

Microsoft
created on 2022-01-23

text 98.3
person 92.3
clothing 82.8
black and white 74
human face 66.4
woman 53.9
store 38.5
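
The Microsoft tags match the output of the Azure Computer Vision Analyze endpoint with the Tags visual feature. A minimal sketch, assuming a placeholder resource endpoint and subscription key:

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

with open("photo.jpg", "rb") as f:  # placeholder local file
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY,
             "Content-Type": "application/octet-stream"},
    data=image_bytes,
)

# Azure reports confidence on a 0-1 scale; scale it to match the 0-100 values above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")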

Face analysis

Amazon

AWS Rekognition

Age 51-59
Gender Female, 96.6%
Calm 97.5%
Happy 1%
Confused 0.4%
Surprised 0.3%
Disgusted 0.3%
Sad 0.2%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 30-40
Gender Female, 73.6%
Calm 80.9%
Sad 9.7%
Fear 3.4%
Happy 1.9%
Confused 1.4%
Angry 1%
Surprised 1%
Disgusted 0.8%

AWS Rekognition

Age 18-26
Gender Male, 84.8%
Calm 60%
Sad 13%
Fear 9.5%
Disgusted 5%
Surprised 3.9%
Happy 3.7%
Angry 3%
Confused 1.9%

AWS Rekognition

Age 31-41
Gender Female, 99.6%
Calm 99.2%
Surprised 0.3%
Angry 0.1%
Happy 0.1%
Sad 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 18-26
Gender Female, 98%
Fear 66.8%
Surprised 19.7%
Sad 5.8%
Calm 2.3%
Confused 2%
Angry 1.4%
Happy 1.2%
Disgusted 0.9%

AWS Rekognition

Age 49-57
Gender Male, 88.2%
Calm 73.4%
Happy 8.2%
Sad 5.3%
Confused 4.6%
Fear 3.3%
Disgusted 2%
Surprised 1.7%
Angry 1.4%

AWS Rekognition

Age 19-27
Gender Female, 90.6%
Sad 71.8%
Calm 22.7%
Fear 2%
Happy 0.9%
Angry 0.8%
Disgusted 0.8%
Surprised 0.5%
Confused 0.5%

AWS Rekognition

Age 31-41
Gender Female, 96.3%
Calm 79.3%
Happy 12.2%
Surprised 6.1%
Angry 1%
Fear 0.6%
Disgusted 0.4%
Sad 0.4%
Confused 0.1%
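
Each block above corresponds to one face returned by Rekognition's DetectFaces operation with full attributes requested. A sketch of how the age range, gender, and emotion scores could be reproduced (the file name is a placeholder):

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder local file
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are listed highest-confidence first in this record; sort to match.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")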

Feature analysis

Amazon

Person 98.5%
Glasses 63.3%

Captions

Microsoft

a group of people in a store 44.4%
a group of people standing in front of a store 44.3%
a person standing in front of a crowd 44.2%
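
The captions are the kind of result produced by Azure Computer Vision's Describe Image operation, which returns several candidate captions with confidences. A sketch under the same placeholder-credential assumptions as the Tags example above:

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

with open("photo.jpg", "rb") as f:  # placeholder local file
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": KEY,
             "Content-Type": "application/octet-stream"},
    data=image_bytes,
)

for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")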

Text analysis

Amazon

17244.
BE
NAOCH
U
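
The strings above come from Rekognition's DetectText operation, which picks up the reference numbers and marks visible on the negative. A minimal sketch (the file name is a placeholder):

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder local file
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections give whole strings; WORD detections give the individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])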

Google

17244. 17246 192
192
17244.
17246
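
The Google results are typical of Cloud Vision text detection, where the first annotation is the full detected text and the remaining entries are individual tokens. A sketch assuming the google-cloud-vision client library and default application credentials:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder local file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# First entry is the concatenated block; later entries are single tokens such as "17244." or "192".
for annotation in response.text_annotations:
    print(annotation.description)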