Human Generated Data

Title

Untitled (woman in chair)

Date

1953

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20067

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.5
Apparel 99.5
Chair 98.7
Furniture 98.7
Human 97.2
Person 97.2
Architecture 92
Building 92
Home Decor 89.2
Pillar 84
Column 84
Indoors 72
Interior Design 72
Footwear 70.9
Shoe 70.9
Portrait 61.6
Photography 61.6
Face 61.6
Photo 61.6
Floor 61.3
Coat 59.2
Suit 57.3
Overcoat 57.3
Female 57.1
Door 55.8

Imagga
created on 2022-03-05

turnstile 40
gate 32.4
movable barrier 24.6
chair 23
fashion 21.1
barber chair 20.6
portrait 19.4
adult 17.5
attractive 17.5
man 17.5
city 17.5
seat 17.2
people 16.7
barrier 16.7
person 16.5
pretty 16.1
black 15.8
dress 15.4
posing 14.2
model 14
urban 14
building 13.8
sexy 13.6
male 13.5
style 13.3
smile 12.1
lady 11.4
sensuality 10.9
pose 10.9
happy 10.6
legs 10.4
professional 10.2
youth 10.2
street 10.1
face 9.9
interior 9.7
one 9.7
body 9.6
home 9.6
worker 9.5
hair 9.5
furniture 9.5
architecture 9.4
elegance 9.2
business 9.1
indoors 8.8
clothing 8.8
brunette 8.7
standing 8.7
lifestyle 8.7
sculpture 8.6
leg 8.6
newspaper 8.5
obstruction 8.5
human 8.2
life 8
smiling 8
women 7.9
cute 7.9
statue 7.9
art 7.8
travel 7.7
old 7.7
career 7.6
stylish 7.2
family 7.1
happiness 7
look 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

floor 96.1
furniture 95.3
indoor 94.5
text 93.4
woman 92.1
person 90.9
black and white 83.9
chair 82.7
clothing 81.1
piano 72.7
footwear 71.6
table 69.3

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 95.1%
Surprised 92.7%
Happy 3.8%
Calm 1.2%
Fear 0.9%
Confused 0.8%
Sad 0.3%
Disgusted 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.2%

Captions

Microsoft

a woman wearing a dress 87.4%
a woman standing in a room 87.3%
a woman wearing a costume 80.3%

Text analysis

Amazon

KODAK-A--EITW

Google

2--XAGON
MJI7--YT3RA
MJI7--YT3RA 2--XAGON