Human Generated Data

Title

Untitled (woman getting a manicure while her hair is being dried)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8442

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Furniture 98.4
Person 98.1
Human 98.1
Sitting 97.3
Person 95.8
Person 92.7
Chair 92.1
Text 67
Couch 63.5
Portrait 61.1
Photography 61.1
Face 61.1
Photo 61.1
Cafe 58.9
Restaurant 58.9
Clothing 57.8
Apparel 57.8
Studio 57.5
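
The label/score pairs above are in the format returned by Amazon Rekognition's DetectLabels operation. The following is only an illustrative sketch of such a call; the file name, region, and thresholds are assumptions and are not part of this record.

    import boto3

    # Illustrative sketch (assumed file name and region, not from the record).
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_8442.jpg", "rb") as f:  # assumed local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        # Prints e.g. "Furniture 98.4", matching the tag/score format above.
        print(f'{label["Name"]} {label["Confidence"]:.1f}')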

Imagga
created on 2022-01-15

man 28.2
people 27.3
person 24.4
male 24.1
barbershop 20.8
adult 20.2
shop 20.2
men 19.7
business 18.8
black 15.6
sax 15.6
office 15.5
room 14.9
mercantile establishment 14.6
businessman 14.1
work 12.8
job 12.4
professional 12.1
lifestyle 11.6
newspaper 11.5
working 11.5
chair 11.4
wind instrument 11.4
fashion 11.3
smile 10.7
brass 10.7
computer 10.5
occupation 10.1
equipment 9.9
modern 9.8
human 9.7
place of business 9.7
portrait 9.7
musical instrument 9.6
home 9.6
women 9.5
music 9.1
dress 9
worker 9
technology 8.9
style 8.9
interior 8.8
light 8.7
product 8.6
sitting 8.6
player 8.5
house 8.3
hand 8.3
alone 8.2
family 8
smiling 8
indoors 7.9
desk 7.9
happiness 7.8
concert 7.8
space 7.8
attractive 7.7
musician 7.7
youth 7.7
old 7.7
dance 7.6
elegance 7.6
monitor 7.5
manager 7.4
silhouette 7.4
instrument 7.4
holding 7.4
laptop 7.4
entertainment 7.4
back 7.3
building 7.3
lady 7.3
group 7.2
world 7.2
art 7.2
love 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.4
man 90.4
clothing 78.9
person 75.2
furniture 65.1
drawing 54.6

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 71.2%
Calm 87.7%
Angry 3.2%
Happy 3%
Sad 2.4%
Surprised 2.3%
Confused 0.6%
Disgusted 0.5%
Fear 0.3%
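
The age range, gender estimate, and emotion scores above correspond to the fields returned by Rekognition's DetectFaces when all attributes are requested. A minimal sketch, with the file name and region assumed:

    import boto3

    # Illustrative sketch (assumed file name and region, not from the record).
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_8442.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]    # e.g. {"Low": 34, "High": 42}
        gender = face["Gender"]   # e.g. {"Value": "Female", "Confidence": 71.2}
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:  # CALM, ANGRY, HAPPY, SAD, ...
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')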

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
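
The "Very unlikely" ratings above are the likelihood enums that the Google Cloud Vision face detection API returns for each detected face. An illustrative sketch, assuming a local copy of the image and credentials configured in the environment:

    from google.cloud import vision

    # Illustrative sketch (assumed file name; credentials come from the environment).
    client = vision.ImageAnnotatorClient()

    with open("steinmetz_8442.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Enum names such as VERY_UNLIKELY map to the ratings shown above.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)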

Feature analysis

Amazon

Person 98.1%
Chair 92.1%

Captions

Microsoft

a man sitting in front of a building 69.3%
a man standing in front of a building 69.1%
a man sitting in a chair in front of a building 69%

Text analysis

Amazon

12561
12561.
DOOX<
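
Strings such as "12561" above are the kind of line and word detections produced by Rekognition's DetectText. A minimal sketch under the same assumptions as the earlier examples (assumed local image file and region):

    import boto3

    # Illustrative sketch (assumed file name and region, not from the record).
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_8442.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # WORD detections are returned as well
            print(detection["DetectedText"])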

Google

12561. 12561 19521
12561.
12561
19521