Human Generated Data

Title

Untitled (eight men and one woman standing and seated on porch)

Date

c. 1905

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3899

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.7
Human 99.7
Person 99.3
Person 98.9
Person 97.4
Person 97
Person 95.8
Person 95.4
Person 94.4
Clothing 89.2
Apparel 89.2
People 87.3
Chair 86.6
Furniture 86.6
Person 84.6
Indoors 82.5
Room 82.5
Person 76.6
Suit 71.1
Coat 71.1
Overcoat 71.1
Clinic 70.5
Shorts 62.1
Accessories 60.2
Tie 60.2
Accessory 60.2
Window 59.5
Classroom 59.4
School 59.4
Porch 57.2
Table 57
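
Each machine-generated tag above is a label paired with a confidence score (0–100). A minimal sketch, not the museum's actual pipeline, of how such lists are commonly post-processed: a threshold filters out low-confidence guesses before tags are displayed or indexed. The label data below is copied from the Amazon block above; the threshold value is an assumption for illustration.

```python
# Hedged sketch: each tag is a (name, confidence) pair, as in the
# Amazon Rekognition list above. A threshold keeps only confident tags.
labels = [
    ("Person", 99.7), ("Clothing", 89.2), ("Chair", 86.6),
    ("Suit", 71.1), ("Porch", 57.2), ("Table", 57.0),
]

def confident_labels(labels, threshold=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in labels if score >= threshold]

print(confident_labels(labels))  # -> ['Person', 'Clothing', 'Chair']
```

With the default 80.0 cutoff, plausible but weaker guesses such as "Porch" (57.2) and "Suit" (71.1) would be dropped while "Person", "Clothing", and "Chair" survive.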

Clarifai
created on 2019-06-01

people 99.8
group together 99
group 98.9
adult 97.9
man 96
wear 95
woman 94.9
several 92.4
many 91
music 89.6
leader 89.1
four 89.1
five 87.6
monochrome 87.5
outfit 85.1
musician 83
three 82.7
administration 81.5
facial expression 79.5
medical practitioner 78.3

Imagga
created on 2019-06-01

kin 53.4
barbershop 45.3
shop 37.3
mercantile establishment 28.8
people 25.1
man 23.5
couple 23.5
male 22
place of business 19.2
happy 17.5
men 16.3
happiness 15.7
adult 15
person 14.7
bride 14.4
home 13.5
love 13.4
family 13.3
nurse 12.9
dress 12.6
smiling 12.3
portrait 12.3
wedding 11.9
groom 11.8
two 11
business 10.9
black 10.8
businessman 10.6
room 10.5
together 10.5
bouquet 10.4
women 10.3
cheerful 9.7
establishment 9.6
married 9.6
smile 9.3
old 9
window 9
office 8.8
world 8.5
mother 8.5
building 8.4
fun 8.2
new 8.1
group 8.1
celebration 8
urban 7.9
day 7.8
sitting 7.7
house 7.5
one 7.5
silhouette 7.4
future 7.4
romantic 7.1
interior 7.1
indoors 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

window 95.8
posing 92.9
person 89.1
clothing 86.7
man 62.3

Face analysis

Amazon

AWS Rekognition

Age 35-55
Gender Male, 52.2%
Disgusted 45.1%
Sad 48.3%
Confused 45.3%
Happy 45.2%
Angry 45.9%
Surprised 45.2%
Calm 49.9%

AWS Rekognition

Age 45-65
Gender Female, 50.2%
Disgusted 45.3%
Calm 49.2%
Angry 45.7%
Sad 48.9%
Surprised 45.3%
Happy 45.2%
Confused 45.3%

AWS Rekognition

Age 23-38
Gender Female, 53.4%
Angry 45.2%
Sad 49.7%
Disgusted 45.1%
Surprised 45.3%
Happy 45.4%
Calm 47.8%
Confused 46.4%

AWS Rekognition

Age 19-36
Gender Female, 50.8%
Happy 45.9%
Disgusted 45.3%
Angry 45.4%
Surprised 45.2%
Sad 47.7%
Calm 50.2%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Confused 45.4%
Calm 51.1%
Sad 46.7%
Surprised 45.4%
Angry 45.6%
Disgusted 45.3%
Happy 45.5%
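
Each face record above carries a full set of emotion scores rather than a single verdict; the scores need not sum to 100, and the reported emotion is simply the highest-scoring entry. A minimal sketch of that reading, using the first face's scores copied from above (this is an illustration of how such score lists are interpreted, not the service's internal logic):

```python
# Hedged sketch: a face record as a dict of emotion -> confidence score,
# values taken from the first AWS Rekognition face block above.
face = {
    "Disgusted": 45.1, "Sad": 48.3, "Confused": 45.3, "Happy": 45.2,
    "Angry": 45.9, "Surprised": 45.2, "Calm": 49.9,
}

def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(face))  # -> Calm
```

For the first face, "Calm" (49.9) narrowly edges out "Sad" (48.3); the closeness of the scores is a reminder that these are weak, machine-estimated signals.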

Feature analysis

Amazon

Person 99.7%
Tie 60.2%
