Human Generated Data

Title

Untitled (man standing next to snow sculpture)

Date

c. 1935-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1475

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 97.3
Person 97.3
Clothing 96.9
Apparel 96.9
Person 96.8
Standing 75.9
Art 71.9
Photo 69.6
Photography 69.6
Face 67.4
Portrait 67.4
Nature 66.5
Female 62.3
Wall 59.2
Sculpture 58.8
Shorts 58.2
Furniture 57.4
Chair 57.4
Outdoors 57.4
Path 55.6

Clarifai
created on 2019-06-01

people 99.8
man 98.9
monochrome 98.1
adult 97.5
two 96.2
woman 93.7
group 89.4
home 88.7
street 87.4
winter 86.9
group together 86.6
child 86.2
nature 84.8
one 84.6
water 84.4
wear 80.9
snow 80.4
boy 76.4
portrait 75.2
family 74.6

Imagga
created on 2019-06-01

picket fence 37.8
fence 36.7
barrier 25.5
building 19.8
man 18.1
structure 17.9
obstruction 17
people 16.2
architecture 15.9
male 14.9
travel 14.8
black 12.6
old 12.5
person 12.3
outdoor 12.2
groom 12.1
snow 11.6
adult 11.1
winter 11.1
landscape 10.4
cold 10.3
house 10.1
cleaner 10
dress 9.9
silhouette 9.9
park 9.9
outdoors 9.7
couple 9.6
men 9.4
outside 9.4
two 9.3
city 9.1
bride 9
light 8.7
love 8.7
water 8.7
business 8.5
street 8.3
landmark 8.1
art 8.1
women 7.9
urban 7.9
day 7.8
modern 7.7
tourism 7.4
vacation 7.4
wedding 7.4
statue 7.2
religion 7.2
river 7.1
trees 7.1
column 7.1
sky 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

black and white 93.5
man 90.4
clothing 74.8
person 74
fog 59.4

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 51.9%
Disgusted 45.3%
Calm 49.7%
Angry 45.3%
Sad 48.2%
Happy 46%
Surprised 45.3%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Female, 50.7%
Confused 45.8%
Surprised 45.5%
Calm 50.3%
Sad 46.8%
Happy 45.4%
Disgusted 45.7%
Angry 45.5%

Feature analysis

Amazon

Person 97.3%

Text analysis

Amazon

702HF1