Human Generated Data

Title

Untitled (woman reading on bunk with other seated woman, Ringling Brothers)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4563

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 97.6%
Human 97.6%
Clothing 96.5%
Apparel 96.5%
Person 92.9%
Furniture 81%
Female 77.4%
Tub 69.9%
Helmet 62.6%
Girl 61.1%
Swimwear 59.3%
Art 58.9%
Bed 58.8%
Woman 57.8%
Head 57.1%
Shorts 57.1%

Imagga
created on 2022-02-05

television 43.7%
person 32.8%
laptop 31.3%
telecommunication system 28.2%
sitting 27.5%
people 27.3%
adult 26.6%
computer 22.4%
working 20.3%
lifestyle 20.2%
smile 19.9%
happy 19.4%
one 19.4%
women 19%
work 18.8%
attractive 18.2%
technology 17.8%
notebook 17.8%
business 17.6%
pretty 17.5%
smiling 17.4%
wireless 17.2%
indoors 16.7%
casual 16.1%
home 15.1%
portrait 14.9%
model 14.8%
lady 14.6%
sexy 14.4%
hair 14.3%
office 13.7%
fashion 13.6%
job 13.3%
success 12.9%
man 12.8%
relaxation 12.6%
worker 12.4%
monitor 12.4%
brunette 12.2%
male 12.1%
businesswoman 11.8%
communication 11.7%
clothing 11.6%
cheerful 11.4%
corporate 11.2%
student 10.9%
desk 10.5%
professional 10.3%
lying 10.3%
executive 10.1%
relaxing 10%
negative 10%
modern 9.8%
interior 9.7%
sofa 9.7%
room 9.6%
using 9.6%
looking 9.6%
education 9.5%
blond 9.4%
happiness 9.4%
film 9.4%
phone 9.2%
suit 9%
body 8.8%
typing 8.8%
elegant 8.6%
relax 8.4%
electronic equipment 8.4%
car 8.3%
human 8.2%
indoor 8.2%
equipment 8.2%
pose 8.1%
cute 7.9%
face 7.8%
full length 7.8%
resting 7.6%
table 7.4%
successful 7.3%
sensuality 7.3%
screen 7.3%
photographic paper 7.2%
broadcasting 7.2%

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.8%
drawing 90.5%
person 89%
sketch 82.1%
black and white 65.4%

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 54.1%
Happy 85.4%
Sad 9.7%
Calm 2.7%
Confused 0.6%
Angry 0.5%
Disgusted 0.4%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 26-36
Gender Male, 94.5%
Sad 82.6%
Calm 13.9%
Confused 0.8%
Disgusted 0.8%
Fear 0.6%
Angry 0.5%
Surprised 0.5%
Happy 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.6%
Helmet 62.6%

Text analysis

Amazon

16217.
True
5
Love-Romance True
Love-Romance
RISSES
STYLE RISSES
15217.
MIIT
STYLE
MIIT ПТАЯТIИ E70A
ПТАЯТIИ
E70A

Google

16217.
16207: 16217. 16217.
16207: