Human Generated Data

Title

Untitled (smiling woman at piano)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4439

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.1
Human 99.1
Electronics 95.4
Person 92.4
Person 90.5
Keyboard 88
Person 82.5
Person 81.1
Person 74.2
Person 65.6
Person 63.5
Person 59.4
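
These label/confidence pairs match the output of Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags are produced, assuming configured AWS credentials; the image path, MaxLabels, and MinConfidence values are illustrative and not taken from this record:

# Sketch: label tags like the Amazon list above, via AWS Rekognition.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # illustrative cap
        MinConfidence=50.0,  # illustrative threshold
    )

# Each label carries a name and a 0-100 confidence, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")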

Imagga
created on 2022-01-23

negative 75.4
film 60.6
photographic paper 45.4
photographic equipment 30.2
money 21.3
man 19
business 18.8
cash 18.3
people 17.3
person 17.2
currency 16.1
male 15.6
newspaper 15.3
banking 14.7
bank 14.4
product 14.1
dollar 13.9
creation 13.2
finance 12.7
pay 12.5
work 11.8
banknotes 11.7
portrait 11.6
savings 11.2
rich 11.2
old 11.1
professional 10.7
financial 10.7
face 10.7
hair 10.3
musical instrument 10.2
paper 10.2
wealth 9.9
businessman 9.7
payment 9.6
loan 9.6
exchange 9.5
architecture 9.4
adult 9.4
commerce 9.3
smile 9.3
slick 9
franklin 8.9
looking 8.8
happy 8.8
economic 8.7
bill 8.6
daily 8.5
black 8.4
human 8.2
technology 8.2
closeup 8.1
handsome 8
market 8
medical 7.9
building 7.9
indoors 7.9
laboratory 7.7
us 7.7
test 7.7
attractive 7.7
price 7.7
health 7.6
instrument 7.6
doctor 7.5
senior 7.5
one 7.5
student 7.2
medicine 7
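
The Imagga tags above are in the shape returned by Imagga's /v2/tags endpoint. A sketch using the requests library; the API key, secret, and image URL are placeholders, not details from this record:

# Sketch: fetching tags like the Imagga list above via the Imagga v2 API.
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder
    auth=(API_KEY, API_SECRET),
)

# Imagga returns tags with 0-100 confidences, matching the list format above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")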

Microsoft
created on 2022-01-23

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 40-48
Gender Male, 63.8%
Calm 64.2%
Confused 19.3%
Surprised 14.1%
Sad 0.7%
Angry 0.5%
Fear 0.5%
Disgusted 0.5%
Happy 0.3%

AWS Rekognition

Age 16-24
Gender Male, 90.4%
Calm 32.9%
Disgusted 20.1%
Sad 17.2%
Surprised 9.7%
Fear 8.3%
Angry 5.7%
Happy 5%
Confused 1.2%

AWS Rekognition

Age 19-27
Gender Female, 66.7%
Calm 94.3%
Sad 3.8%
Happy 0.7%
Confused 0.5%
Angry 0.3%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 16-24
Gender Male, 61.2%
Calm 97.3%
Sad 1.7%
Surprised 0.3%
Angry 0.2%
Happy 0.2%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 28-38
Gender Female, 56.8%
Calm 54.6%
Surprised 18.7%
Happy 13.6%
Disgusted 3%
Confused 2.7%
Sad 2.6%
Angry 2.4%
Fear 2.4%

AWS Rekognition

Age 23-31
Gender Female, 57.9%
Calm 54.1%
Sad 11.4%
Happy 10.5%
Disgusted 10%
Confused 5.7%
Angry 3.4%
Surprised 2.9%
Fear 2.1%

AWS Rekognition

Age 45-53
Gender Male, 83.5%
Surprised 57.9%
Calm 28.7%
Confused 4.5%
Fear 3.2%
Disgusted 1.9%
Happy 1.8%
Angry 1.3%
Sad 0.7%

AWS Rekognition

Age 24-34
Gender Female, 99.3%
Calm 53.4%
Confused 19.2%
Angry 9.5%
Surprised 8.9%
Happy 3.8%
Sad 2.2%
Disgusted 1.8%
Fear 1.1%
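
Each block above (an age range, a gender with confidence, and emotions ranked by confidence) mirrors one FaceDetails entry from AWS Rekognition's DetectFaces operation with all attributes requested. A sketch, assuming configured AWS credentials and a placeholder image path:

# Sketch: face attributes as in the AWS Rekognition blocks above.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; the record lists them by confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")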

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
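
The per-face likelihood buckets above ("Very unlikely" through "Very likely") correspond to Google Cloud Vision face detection, which reports each attribute as a bucketed Likelihood enum rather than a percentage. A sketch, assuming Google Cloud credentials and the google-cloud-vision package:

# Sketch: per-face likelihood buckets as in the Google Vision blocks above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood is a bucketed enum (VERY_UNLIKELY ... VERY_LIKELY),
# rendered in the record as "Very unlikely", etc.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)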

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

people around each other 54.5%
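
A caption paired with a confidence percentage, like the one above, is the shape returned by Azure Computer Vision's Describe Image operation. A sketch over REST; the endpoint, key, image path, and API version are placeholders rather than details from this record:

# Sketch: a caption like "people around each other 54.5%" via Azure
# Computer Vision's Describe Image call.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_azure_key"                                            # placeholder

with open("photo.jpg", "rb") as f:  # placeholder path
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

# Captions carry a 0-1 confidence, shown above as a percentage.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")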

Text analysis

Amazon

3
17243.
19243.
88
1724 3
1724
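
Fragments like these are typical raw output of AWS Rekognition's DetectText operation on an archival print (e.g., stamped negative numbers). A sketch, assuming configured AWS credentials and a placeholder image path:

# Sketch: text fragments like those above via AWS Rekognition DetectText.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections aggregate WORD detections; the record lists the raw strings.
for detection in response["TextDetections"]:
    print(detection["DetectedText"], detection["Type"], f"{detection['Confidence']:.1f}")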

Google

VAMTRA n243. 17242
VAMTRA
n243.
17242
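
The Google fragments follow the convention of Cloud Vision text detection, where the first annotation is the full detected block ("VAMTRA n243. 17242") and the remaining annotations are its individual tokens. A sketch, assuming Google Cloud credentials and the google-cloud-vision package:

# Sketch: the Google text fragments above via Cloud Vision text detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are the
# individual tokens, matching the list above.
for annotation in response.text_annotations:
    print(annotation.description)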