Human Generated Data

Title

Untitled (family portrait on porch of home with children holding dolls and teddy bears)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3707

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-06-01

Furniture 99.6
Person 99.6
Human 99.6
Person 99.5
Person 99
Person 98.1
Clothing 96.2
Apparel 96.2
Person 91.5
Person 81.3
Person 78.1
People 74.3
Face 69.5
Person 62.1
Kid 61.9
Child 61.9
Porch 59.6
Indoors 59
Hat 56.7
Steamer 55.1
Chair 51.8
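The label/confidence pairs above have the shape of an Amazon Rekognition DetectLabels response. As a minimal sketch (the response dict below is hard-coded for illustration, echoing a few tags from the list above rather than an actual API call), such tags could be filtered by a confidence threshold like this:

```python
# Hypothetical Rekognition-style DetectLabels response; labels and scores
# are copied from the tag list above for illustration only.
response = {
    "Labels": [
        {"Name": "Furniture", "Confidence": 99.6},
        {"Name": "Person", "Confidence": 99.6},
        {"Name": "Porch", "Confidence": 59.6},
        {"Name": "Chair", "Confidence": 51.8},
    ]
}

def tags_above(response, threshold):
    """Return (name, confidence) pairs at or above the given threshold."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= threshold
    ]

print(tags_above(response, 90))  # keeps only the high-confidence tags
```

Low-scoring tags (e.g. Chair at 51.8) are near coin-flip confidence, which is why lists like the one above usually get thresholded before display.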

Clarifai
created on 2019-06-01

people 99.4
adult 95.8
child 95.3
man 94.4
group 93.8
indoors 92.2
woman 91.1
room 88.3
family 88
group together 86.8
monochrome 85.2
sit 80.9
wear 80.1
furniture 79.2
facial expression 73.9
four 72.2
chair 72
leader 68.2
five 67.9
actor 67

Imagga
created on 2019-06-01

picket fence 52.8
fence 42.3
people 32.9
barrier 31.6
kin 29.3
male 24.8
man 22.8
person 22.7
obstruction 21.2
adult 20.3
professional 20
medical 18.5
men 17.2
portrait 16.8
business 15.8
human 15.7
laboratory 15.4
team 15.2
doctor 15
biology 14.2
group 13.7
lab 13.6
film 13.3
negative 13.1
worker 13
office 12.8
scientist 12.7
nurse 12.6
test 12.5
health 12.5
family 12.4
research 12.4
businessman 12.4
medicine 12.3
smiling 12.3
corporate 12
modern 11.9
work 11.9
casual 11.9
scientific 11.6
job 11.5
occupation 11
day 11
indoor 10.9
chemical 10.8
smile 10.7
colleagues 10.7
science 10.7
face 10.6
happy 10.6
chemistry 10.6
indoors 10.5
hospital 10.4
businesspeople 10.4
looking 10.4
instrument 10.2
structure 10.2
20s 10.1
clinic 9.9
working 9.7
coat 9.7
development 9.6
women 9.5
happiness 9.4
musical instrument 9.3
patient 9.1
confident 9.1
fashion 9
color 8.9
brass 8.8
home 8.8
love 8.7
lifestyle 8.7
life 8.6
sitting 8.6
window 8.5
room 8.4
black 8.4
attractive 8.4
old 8.4
wind instrument 8.2
child 8.2
businesswoman 8.2
photographic paper 8.1
glass 8
microscope 7.9
education 7.8
assistant 7.8
exam 7.7
confidence 7.7
study 7.5
equipment 7.4
teamwork 7.4
care 7.4
light 7.3
design 7.3
clothing 7.3
student 7.2
dress 7.2
mother 7.1
handsome 7.1
interior 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

clothing 98
window 95.2
posing 93.2
person 85.9
smile 80.6
man 74
human face 71.5
old 60.9

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 51.7%
Confused 45.2%
Happy 45.2%
Surprised 45.3%
Angry 45.4%
Disgusted 45.1%
Calm 50.6%
Sad 48.2%

AWS Rekognition

Age 26-43
Gender Female, 54%
Angry 45.7%
Happy 48.7%
Confused 45.3%
Sad 49%
Calm 45.8%
Surprised 45.3%
Disgusted 45.3%

AWS Rekognition

Age 26-43
Gender Female, 54.1%
Angry 45.5%
Sad 49.9%
Disgusted 45.5%
Happy 46.4%
Calm 46.7%
Surprised 45.6%
Confused 45.5%

AWS Rekognition

Age 23-38
Gender Male, 53.7%
Angry 45.4%
Sad 49.1%
Disgusted 45.2%
Surprised 45.3%
Happy 45.4%
Calm 49.3%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Female, 75%
Sad 42.4%
Happy 11.9%
Confused 3.6%
Angry 3.9%
Surprised 3.8%
Disgusted 4.2%
Calm 30.1%

AWS Rekognition

Age 23-38
Gender Female, 52.6%
Angry 45.3%
Happy 45%
Confused 45.5%
Calm 52.9%
Disgusted 45.1%
Sad 46%
Surprised 45.2%

Feature analysis

Amazon

Person 99.6%
Chair 51.8%

Categories

Imagga

paintings art 97.4%
text visuals 2.3%