Human Generated Data

Title

Untitled (girl in chair, knitting)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16740

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.4
Human 98.4
Furniture 97
Bed 77.7
Female 76.2
Couch 71.4
Blonde 69.2
Teen 69.2
Kid 69.2
Child 69.2
Girl 69.2
Woman 69.2
Face 64.5
Finger 63.5
Portrait 62.9
Photography 62.9
Photo 62.9
Chair 59.3
Clothing 57.5
Apparel 57.5
Room 56.8
Indoors 56.8

Clarifai
created on 2023-10-29

people 99.8
one 97.5
monochrome 95.7
adult 95.3
nostalgia 93.8
sit 92.4
woman 92.2
child 90.2
wear 89.7
actress 89.6
retro 88.9
indoors 87.8
portrait 87.4
art 87.3
man 87.2
furniture 87.2
music 87.2
two 86.4
chair 85.9
room 85

Imagga
created on 2022-02-26

laptop 49.9
computer 43.5
person 40.4
adult 37.2
home 30.3
business 29.8
people 29
male 28.4
sitting 28.3
happy 28.2
office 27.5
working 26.5
man 26.2
businesswoman 23.6
technology 23
senior 22.5
smiling 22.4
work 22
professional 21.3
wind instrument 20.8
indoors 20.2
smile 20
corporate 19.8
looking 19.2
oboe 19
women 19
worker 18.8
portrait 18.8
businessman 18.5
musical instrument 18.1
casual 17.8
house 17.6
attractive 17.5
elderly 16.3
executive 16.2
brass 16.1
modern 15.4
lady 15.4
communication 15.1
job 15
face 14.2
holding 14
mature 13.9
lifestyle 13.7
successful 13.7
reading 13.3
old 13.2
men 12.9
one 12.7
student 12.7
handsome 12.5
couple 12.2
room 12.2
scholar 12.1
success 12.1
notebook 11.9
sofa 11.6
couch 11.6
desk 11.6
book 11.6
businesspeople 11.4
cheerful 11.4
education 11.3
pretty 11.2
phone 11.1
alone 11
indoor 11
confident 10.9
workplace 10.5
device 10.4
teacher 10.3
manager 10.2
happiness 10.2
chair 10
browsing 9.8
living room 9.8
human 9.7
table 9.7
intellectual 9.7
retirement 9.6
wireless 9.5
hair 9.5
relax 9.3
horizontal 9.2
blond 9.1
suit 9.1
group 8.9
retired 8.7
using 8.7
model 8.6
company 8.4
life 8.4
studio 8.4
monitor 8.2
woodwind 8.1
interior 8
together 7.9
look 7.9
seated 7.8
corporation 7.7
talking 7.6
thinking 7.6
mobile 7.5
meeting 7.5
relaxed 7.5
leisure 7.5
study 7.5
glasses 7.4
call 7.3
black 7.2
team 7.2
secretary 7.1
guitar 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

wall 98.1
text 97.5
sitting 96.9
indoor 95
person 83.3
clothing 79.9
drawing 79
book 68.7
black and white 64.5
human face 61.9
old 57.3

Color Analysis

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 99.5%
Sad 50.5%
Happy 22.2%
Calm 13.4%
Angry 5%
Confused 3.6%
Disgusted 2.4%
Surprised 1.6%
Fear 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%
Bed 77.7%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

re
YТ3°-

Google

re wwwE
re
wwwE