Human Generated Data

Title

Untitled (formal studio portrait with ten men and women in sitting and standing positions)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3725

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 98.6
Person 98.6
Person 98.6
Person 97.6
Person 97.1
Person 96.9
Person 95.9
Person 95.3
Person 92.6
Person 91.6
Person 84
Nature 73.2
Face 72.7
Outdoors 71.8
Nurse 64
People 63.8
Sailor Suit 61.3
Portrait 60.1
Photography 60.1
Photo 60.1

Clarifai
created on 2019-06-01

people 99.7
group 98.7
adult 98.2
man 97.5
wear 96
group together 95.3
woman 93.9
several 92.8
retro 90.2
portrait 87.1
many 86.8
child 85.4
veil 83
five 80.8
outfit 77.9
four 76.5
art 75.8
leader 74.9
paper 74.2
vintage 73.9

Imagga
created on 2019-06-01

negative 70.2
film 55.9
photographic paper 42.5
photographic equipment 28.3
people 16.7
shower curtain 16.6
old 16
grunge 15.3
person 15.1
curtain 14.2
retro 13.9
vintage 13.2
art 12.7
sketch 12.3
drawing 11.4
design 11.2
color 11.1
texture 11.1
decoration 10.9
antique 10.4
portrait 10.3
graphic 10.2
furnishing 10
blind 9.9
face 9.9
adult 9.7
man 9.6
paper 9.5
love 9.5
happiness 9.4
happy 9.4
aged 9
human 9
business 8.5
silhouette 8.3
pattern 8.2
black 7.8
ancient 7.8
frame 7.7
style 7.4
protective covering 7.4
water 7.3
group 7.2
paint 7.2
representation 7.2
currency 7.2
religion 7.2
textured 7

Microsoft
created on 2019-06-01

wall 99.2
sketch 92.6
old 92.1
drawing 92
posing 91.4
person 89.7
text 85.9
human face 85
clothing 84.6
white 74.8
man 61.4
black and white 58.2
vintage 39.3

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Disgusted 45.4%
Confused 45.4%
Sad 48.1%
Happy 45.3%
Angry 45.5%
Surprised 45.5%
Calm 49.8%

AWS Rekognition

Age 20-38
Gender Male, 54.8%
Surprised 45.6%
Happy 45.5%
Disgusted 45.4%
Calm 51.4%
Sad 46%
Confused 45.7%
Angry 45.4%

AWS Rekognition

Age 26-43
Gender Male, 50.7%
Happy 46.6%
Disgusted 45.4%
Angry 45.3%
Calm 48.9%
Surprised 45.5%
Confused 45.6%
Sad 47.6%

AWS Rekognition

Age 26-43
Gender Male, 54.3%
Angry 47.1%
Surprised 45.6%
Calm 50.8%
Sad 45.9%
Confused 45.3%
Disgusted 45.2%
Happy 45.2%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Sad 45.5%
Angry 45.5%
Calm 52.4%
Confused 45.4%
Disgusted 45.2%
Happy 45.2%
Surprised 45.8%

AWS Rekognition

Age 26-43
Gender Male, 54.6%
Happy 47.6%
Confused 45.8%
Angry 45.7%
Disgusted 45.3%
Surprised 45.8%
Sad 47%
Calm 47.7%

AWS Rekognition

Age 26-43
Gender Male, 87.2%
Disgusted 2%
Sad 68.9%
Calm 18.2%
Confused 2.3%
Surprised 2.1%
Happy 3.1%
Angry 3.4%

AWS Rekognition

Age 26-43
Gender Female, 53.6%
Confused 45.4%
Surprised 45.3%
Angry 45.4%
Calm 45.6%
Disgusted 46%
Sad 51.7%
Happy 45.5%

AWS Rekognition

Age 20-38
Gender Male, 52%
Calm 47.9%
Surprised 45.4%
Sad 49.8%
Confused 45.7%
Disgusted 45.4%
Happy 45.4%
Angry 45.4%

Feature analysis

Amazon

Person 98.6%

Categories

Imagga

paintings art 95.7%
text visuals 4%