Human Generated Data

Title

Untitled (handicapped girl lying on a cart next to a toy dog)

Date

c. 1970

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11456

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Person 98.8
Human 98.8
Furniture 78.7
Clothing 76.5
Apparel 76.5
Monitor 73.1
Electronics 73.1
Screen 73.1
Display 73.1
LCD Screen 73.1
Female 63.5
Girl 61.9
Computer 58.4
Finger 57.9
Indoors 56.3

Imagga
created on 2022-01-14

grand piano 56.7
piano 46.2
stringed instrument 36.5
percussion instrument 34.6
keyboard instrument 34.3
person 28.1
musical instrument 25.7
people 25.7
adult 25.6
happy 22.6
man 19.5
business 18.8
computer 18
laptop 18
office 17.7
portrait 17.5
hair 17.4
smile 17.1
lifestyle 16.6
attractive 15.4
male 14.9
world 14.7
casual 14.4
one 14.2
phone 13.8
women 13.4
smiling 13
corporate 12.9
fashion 12.8
groom 12.8
black 12.7
work 12.6
indoors 12.3
face 12.1
human 12
home 12
pretty 11.9
businesswoman 11.8
happiness 11.8
dress 11.7
model 11.7
job 11.5
modern 11.2
blond 11
relaxation 10.9
working 10.6
couple 10.5
businesspeople 10.4
communication 10.1
water 10
fun 9.7
businessman 9.7
success 9.7
sexy 9.6
looking 9.6
body 9.6
desk 9.6
clothing 9.6
love 9.5
sitting 9.4
expression 9.4
cute 9.3
relax 9.3
child 9.3
professional 9.2
cheerful 8.9
worker 8.9
lady 8.9
table 8.6
men 8.6
dance 8.6
executive 8.5
skin 8.5
back 8.3
indoor 8.2
confident 8.2
technology 8.2
suit 8.1
wet 8
interior 8
scholar 7.8
dark 7.5
room 7.5
manager 7.4
successful 7.3
alone 7.3
playing 7.3
sensual 7.3
group 7.3
director 7.1

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 97.4
black and white 93.8
person 93.2
clothing 93.2
young 80.5
girl 59.2

Face analysis

Amazon

AWS Rekognition

Age 13-21
Gender Male, 98.5%
Calm 77.8%
Sad 12.9%
Happy 3.4%
Surprised 2.8%
Confused 1%
Disgusted 0.8%
Fear 0.7%
Angry 0.6%

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a young boy standing in front of a window 41.4%
a boy standing in front of a window 41.3%
a young boy standing next to a window 31.4%

Text analysis

Amazon

45052
M
p
YT33A2
p M حس YT33A2 02240
02240
حس

Google

PMEYTA2.012A
PMEYTA2.012A