Human Generated Data

Title

Untitled (family portrait on porch of house with children holding dolls and teddy bears)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3706

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.2
Person 99.2
Furniture 99.1
Person 98.2
Person 97.6
Person 95
Accessory 88.5
Tie 88.5
Accessories 88.5
Person 84.3
People 77.2
Person 76.1
Person 74
Indoors 68.4
Room 68.4
Clothing 67.3
Apparel 67.3
Crib 66.2

Clarifai
created on 2019-06-01

people 98.5
man 93
group 92.9
child 89.2
woman 89
adult 89
group together 87.9
indoors 85.3
monochrome 82.9
room 80.1
family 73.2
sit 68.1
actor 67.4
leader 67
wear 64.2
five 63.2
education 61.4
four 61.3
administration 61
many 60.4

Imagga
created on 2019-06-01

picket fence 62.6
kin 49.9
fence 49.9
barrier 37.3
people 30.1
obstruction 25
male 22
man 21.5
couple 20.9
men 20.6
bride 17.3
person 17.2
adult 17.2
family 16
love 15.8
happy 14.4
human 14.2
portrait 14.2
happiness 14.1
wedding 13.8
home 13.5
two 13.5
women 13.4
structure 13.1
smiling 13
groom 12.8
old 12.5
smile 12.1
group 12.1
world 12
professional 11.2
dress 10.8
black 10.8
team 10.7
room 10.5
marriage 10.4
business 10.3
window 10.2
worker 10.1
mother 9.8
bouquet 9.4
musical instrument 9.4
day 9.4
instrument 9.3
life 9.1
vintage 9.1
together 8.8
chemistry 8.7
married 8.6
wife 8.5
marimba 8.4
house 8.3
fashion 8.3
percussion instrument 8.2
cheerful 8.1
interior 8
medical 7.9
indoors 7.9
urban 7.9
scene 7.8
lab 7.8
ceremony 7.8
scientific 7.7
chemical 7.7
laboratory 7.7
health 7.6
daughter 7.6
husband 7.6
senior 7.5
barbershop 7.5
city 7.5
holding 7.4
equipment 7.3
lady 7.3
new 7.3
lifestyle 7.2
romance 7.1
face 7.1
working 7.1
businessman 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

posing 97.6
window 93.4
clothing 92.7
person 90.8
old 73.5
group 64.9
human face 64.7
man 61.6
smile 51.8
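
The machine-generated tags above are flattened "label score" pairs, where a label may itself contain spaces (e.g. "group together", "human face"), so the confidence is always the trailing numeric token. A minimal parsing sketch in plain Python (function and variable names are illustrative, not from any vendor SDK):

```python
def parse_tags(lines):
    # Split each "label score" line on the LAST space, since labels
    # such as "group together" or "human face" contain spaces.
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

microsoft_tags = """\
posing 97.6
window 93.4
human face 64.7
"""
print(parse_tags(microsoft_tags.splitlines()))
```

The same routine works for the Amazon, Clarifai, Imagga, and Microsoft blocks, since all four use the identical one-pair-per-line layout.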

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 38-59
Gender Female, 52.2%
Happy 46.5%
Confused 46.2%
Angry 45.5%
Sad 45.7%
Calm 49.9%
Surprised 45.9%
Disgusted 45.3%

AWS Rekognition

Age 26-43
Gender Male, 56.7%
Angry 4%
Happy 3.5%
Confused 3%
Calm 40.2%
Disgusted 2%
Sad 43.9%
Surprised 3.4%

AWS Rekognition

Age 26-43
Gender Male, 71.1%
Happy 4.8%
Surprised 3.4%
Angry 8.6%
Confused 4.3%
Calm 33.1%
Sad 43.5%
Disgusted 2.4%

AWS Rekognition

Age 26-43
Gender Female, 75.6%
Confused 1.7%
Surprised 2%
Angry 3.5%
Calm 1.8%
Disgusted 2.8%
Sad 64.2%
Happy 24%

AWS Rekognition

Age 26-43
Gender Male, 52.1%
Confused 45.1%
Disgusted 45.2%
Calm 53.4%
Angry 45.1%
Sad 45.6%
Happy 45.3%
Surprised 45.2%
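
Each AWS Rekognition face block reports an independent confidence per emotion, and the values need not sum to 100 (the first face above totals well over 300). A small sketch for reducing such a block to its dominant emotion, using the first face's scores as sample data:

```python
def dominant_emotion(scores):
    # scores: mapping of emotion name -> confidence percent.
    # Rekognition emotion confidences are independent, so we simply
    # take the argmax rather than normalizing.
    return max(scores, key=scores.get)

# First face block from the record above.
face1 = {"Happy": 46.5, "Confused": 46.2, "Angry": 45.5, "Sad": 45.7,
         "Calm": 49.9, "Surprised": 45.9, "Disgusted": 45.3}
print(dominant_emotion(face1))  # Calm
```

By this reading, the five detected faces resolve to Calm, Sad, Sad, Sad, and Calm respectively.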

Feature analysis

Amazon

Person 99.2%
Tie 88.5%

Categories

Imagga

paintings art 97.4%
text visuals 2.2%