Human Generated Data

Title

Ellen, 44 Irving Street, Cambridge, MA

Date

1971

People

Artist: Susan Meiselas, American, born 1948

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1867

Copyright

© Susan Meiselas / Magnum
Machine Generated Data

Tags

Amazon
created on 2022-01-22

Furniture 99.7
Bedroom 97.4
Room 97.4
Indoors 97.4
Couch 95.3
Person 93.9
Human 93.9
Interior Design 91.9
Bed 91.6
Living Room 81.6
Shelf 77.9
Monitor 65.5
Display 65.5
Electronics 65.5
Screen 65.5
Chair 60.3
Undershirt 58.3
Clothing 58.3
Apparel 58.3
Dorm Room 58
Finger 56
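The numbers beside each Amazon tag are percent-confidence scores, so downstream use typically means filtering by a threshold. A minimal sketch, using a subset of the label/score pairs from the list above (the 90% cutoff is an arbitrary illustration, not part of the source data):

```python
# Subset of the Amazon Rekognition labels listed above (name -> confidence %).
labels = {
    "Furniture": 99.7, "Bedroom": 97.4, "Room": 97.4, "Indoors": 97.4,
    "Couch": 95.3, "Person": 93.9, "Human": 93.9, "Interior Design": 91.9,
    "Bed": 91.6, "Living Room": 81.6, "Shelf": 77.9, "Monitor": 65.5,
}

def confident_labels(scores, threshold=90.0):
    """Return label names at or above the threshold, highest confidence first."""
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    return [name for name, score in ranked if score >= threshold]

print(confident_labels(labels))
```

With the values shown, only the nine labels scored at 90% or higher survive; lower-confidence guesses such as "Living Room" and "Monitor" are dropped.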

Imagga
created on 2022-01-22

home 31.9
room 30.8
indoors 28.1
man 27.6
adult 27
people 26.8
computer 25.9
musical instrument 25.6
chair 25.4
interior 24.8
office 24.3
male 22.7
person 22.6
business 21.3
sitting 20.6
indoor 20.1
work 19.6
stringed instrument 19.1
laptop 18.9
couch 18.4
wind instrument 17.8
house 17.5
women 17.4
lifestyle 17.3
desk 16.3
businessman 15.9
table 14.9
looking 14.4
working 14.1
meeting 14.1
bowed stringed instrument 13.8
window 13.7
call 13.3
executive 13
smiling 13
corporate 12.9
men 12.9
cup 12.7
modern 12.6
happy 12.5
adults 12.3
technology 11.9
device 11.4
living 11.4
education 11.3
brass 11.1
inside 11
happiness 11
team 10.7
violin 10.5
book 10.5
furniture 10.5
communication 10.1
businesswoman 10
sofa 10
handsome 9.8
professional 9.8
together 9.6
couple 9.6
break 9.5
reading 9.5
businesspeople 9.5
child 9.4
casual 9.3
harmonica 9.3
teamwork 9.3
smile 9.3
relaxation 9.2
20s 9.2
holding 9.1
family 8.9
job 8.8
love 8.7
acoustic guitar 8.6
females 8.5
floor 8.4
leisure 8.3
phone 8.3
guitar 8.3
alone 8.2
confident 8.2
student 8.2
cheerful 8.1
group 8.1
teacher 8.1
free-reed instrument 7.9
black 7.8
portrait 7.8
relax 7.6
mature 7.4
coffee 7.4
suit 7.3
worker 7.2
face 7.1
architecture 7
glass 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

indoor 97.6
person 96.5
black and white 93.8
window 88.7
clothing 86.4
furniture 85.7
human face 77.2
text 75.1
monochrome 60.2

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 99.8%
Calm 98.9%
Confused 0.5%
Sad 0.3%
Surprised 0.1%
Angry 0.1%
Disgusted 0%
Fear 0%
Happy 0%
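The AWS Rekognition emotion scores above sum to roughly 100%, and a common client-side step is to select the dominant emotion. A minimal sketch using the values from this record:

```python
# Emotion scores from the AWS Rekognition face analysis above (percent).
emotions = {
    "Calm": 98.9, "Confused": 0.5, "Sad": 0.3, "Surprised": 0.1,
    "Angry": 0.1, "Disgusted": 0.0, "Fear": 0.0, "Happy": 0.0,
}

# The dominant emotion is simply the highest-scoring entry.
dominant = max(emotions, key=emotions.get)
print(dominant)
```

For this photograph the result is "Calm" at 98.9%, with all other emotions under 1%.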

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 93.9%
Chair 60.3%

Captions

Microsoft

a person lying in bed next to a window 56.9%
a person lying on a bed next to a window 54.4%
a person lying on a bed 54.3%