Human Generated Data

Title

Untitled (woman knitting)

Date

1970s

People

Artist: Susan Meiselas, American, born 1948

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1853

Copyright

© Susan Meiselas / Magnum

Machine Generated Data

Tags (label confidence, %)

Amazon
created on 2022-01-22

Furniture 99.9
Room 99.7
Indoors 99.7
Person 99
Human 99
Bedroom 97.9
Chair 87.8
Bed 83.2
Dorm Room 82.3
Person 80.6
Crib 78.2
Nursery 56.1
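
The Amazon list above is the kind of output returned by AWS Rekognition's label-detection endpoint, which scores each label on a 0-100 confidence scale. A minimal sketch of reproducing such a list with boto3; the file path and region here are illustrative assumptions, not part of the original record:

    import boto3

    # Assumed local copy of the photograph; the path is a placeholder.
    with open("meiselas_woman_knitting.jpg", "rb") as f:
        image_bytes = f.read()

    client = boto3.client("rekognition", region_name="us-east-1")

    # DetectLabels returns label names with 0-100 confidence scores,
    # matching the "Furniture 99.9", "Room 99.7", ... list above.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")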

Imagga
created on 2022-01-22

room 43.7
interior 36.2
chair 29
home 28.7
table 26.4
indoors 26.3
furniture 22.5
bedroom 21
people 20.6
man 20.2
house 20
bed 19.9
modern 17.5
office 16.7
person 16
inside 15.6
business 15.2
light 14.7
musical instrument 14.5
adult 14.4
chairs 13.7
male 13.6
decor 13.3
luxury 12.9
window 12.8
indoor 12.8
hotel 12.4
lifestyle 12.3
design 11.8
work 11.8
desk 11.7
couch 11.6
comfortable 11.5
floor 11.2
women 11.1
glass 10.9
smiling 10.8
lamp 10.7
style 10.4
seat 10.3
sitting 10.3
men 10.3
computer 10.3
love 10.2
teacher 10.2
decoration 10.1
relax 10.1
sofa 9.8
attractive 9.8
contemporary 9.4
happiness 9.4
elegance 9.2
laptop 9.2
relaxation 9.2
handsome 8.9
family 8.9
happy 8.8
pillow 8.7
television 8.7
couple 8.7
education 8.7
equipment 8.6
apartment 8.6
wind instrument 8.6
meeting 8.5
device 8.3
group 8.1
classroom 8
working 7.9
motel 7.9
suite 7.9
accordion 7.8
rest 7.8
corporate 7.7
empty 7.7
living 7.6
restaurant 7.5
sheet 7.5
wood 7.5
leisure 7.5
life 7.4
lady 7.3
shop 7.1
night 7.1
businessman 7.1
professional 7
together 7
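
The Imagga list comes from its image-tagging REST API. A sketch of the request, assuming the v2 tags endpoint, a key/secret pair, and a public URL for the image (all placeholders); in the v2 response each entry nests the tag text under a language key:

    import requests

    # Placeholder credentials and image URL; substitute real values.
    auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")
    params = {"image_url": "https://example.org/meiselas_woman_knitting.jpg"}

    response = requests.get(
        "https://api.imagga.com/v2/tags", auth=auth, params=params
    )
    response.raise_for_status()

    # Confidence is reported on a 0-100 scale, as in "room 43.7" above.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")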

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

indoor 99.2
wall 99
room 94
chair 91.1
floor 91.1
table 89.9
black and white 87.3
clothing 83.4
person 83.1
house 77.8
bedroom 36.9
furniture 30.6

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 99.9%
Happy 99.4%
Sad 0.2%
Surprised 0.2%
Calm 0%
Fear 0%
Confused 0%
Angry 0%
Disgusted 0%
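
The age range, gender, and emotion percentages above correspond to Rekognition's face-detection output. A sketch with boto3, reusing the illustrative image path from the label example; requesting Attributes=["ALL"] is what adds age, gender, and emotions to each detected face:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("meiselas_woman_knitting.jpg", "rb") as f:  # placeholder path
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes}, Attributes=["ALL"]
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, "
              f"{face['Gender']['Confidence']:.1f}%")
        # Emotion types arrive uppercase (e.g. "HAPPY") with 0-100 scores.
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} "
                  f"{emotion['Confidence']:.1f}%")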

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
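
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages, which is why this block has no numeric scores. A sketch with the google-cloud-vision client, again assuming a local copy of the image:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("meiselas_woman_knitting.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each field is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
    for face in response.face_annotations:
        print("Joy", face.joy_likelihood)
        print("Sorrow", face.sorrow_likelihood)
        print("Anger", face.anger_likelihood)
        print("Surprise", face.surprise_likelihood)
        print("Headwear", face.headwear_likelihood)
        print("Blurred", face.blurred_likelihood)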

Feature analysis

Amazon

Person 99%
Chair 87.8%
Bed 83.2%

Captions

Microsoft

a person sitting on a bed in a room 81.8%
a person sitting in a room 81.7%
a person sitting in a room 81.6%
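
The Microsoft tags and captions match what Azure's Computer Vision Analyze Image endpoint returns when asked for Tags and Description. A sketch against the v3.2 REST API, with placeholder endpoint, key, and image URL; note the API reports confidences on a 0-1 scale, which the page above shows multiplied into percentages:

    import requests

    # Placeholder resource endpoint and key; substitute real values.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    key = "YOUR_AZURE_KEY"

    response = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={"Ocp-Apim-Subscription-Key": key},
        json={"url": "https://example.org/meiselas_woman_knitting.jpg"},
    )
    response.raise_for_status()
    analysis = response.json()

    # Scale 0-1 confidences up to the percentages shown above.
    for tag in analysis["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
    for caption in analysis["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")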