Human Generated Data

Title

Untitled (two women standing in boat with matching aprons and hats)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6004

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.1
Person 99.1
Person 98.7
Person 98.6
Person 98.6
Person 97.8
Person 97
Person 95.9
Boat 95.3
Vehicle 95.3
Transportation 95.3
Watercraft 81.5
Vessel 81.5
Rowboat 74.5
Person 72.3
Camping 65.5
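
These Amazon tags have the shape of output from the AWS Rekognition DetectLabels API. A minimal sketch of such a call with boto3, assuming a hypothetical local file name for the digitized print (the record does not say exactly how the tags were produced):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical path to the digitized photograph.
with open("durette_studio_boat.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=60.0,  # drop labels scored below 60%
    )

# Prints "Name Confidence" pairs in the style of the list above,
# e.g. "Person 99.1".
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))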

Clarifai
created on 2019-11-16

people 99.6
vehicle 99.3
watercraft 98.5
water 96.9
transportation system 96.7
adult 94.1
monochrome 93.5
river 92.5
group 92.2
art 91.9
boat 91.2
military 88.5
man 88.3
war 87.8
cavalry 86.4
aircraft 86.1
winter 86
wear 85.8
snow 85.7
lake 84.5
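
The Clarifai concepts above resemble output from Clarifai's v2 predict endpoint. A hedged sketch using the REST API directly (the API key, image URL, and general-model identifier are assumptions, not values from the record):

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"           # hypothetical credential
IMAGE_URL = "https://example.org/boat.jpg"  # hypothetical image URL
# Public "general" model id on Clarifai's v2 API (assumed).
MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts arrive with values in [0, 1]; scale to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))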

Imagga
created on 2019-11-16

structure 20.7
water 20.7
snow 20.3
silhouette 17.4
landscape 16.4
fountain 16
old 15.3
building 14.6
window 14.2
sky 14
river 13.3
black 13.2
winter 12.8
tree 12.7
sunset 12.6
forest 12.2
lake 11.9
weather 11.6
ocean 11.5
man 11.4
outdoors 11.3
device 11.2
cold 11.2
people 11.1
architecture 11
light 10.7
travel 10.6
frost 10.5
dark 10
trees 9.8
windowsill 9.8
sea 9.4
art 9.1
dirty 9
transportation 9
pond 8.7
scene 8.6
dusk 8.6
boat 8.5
grunge 8.5
male 8.5
wood 8.3
city 8.3
vessel 8.2
scenery 8.1
equipment 8
sill 7.8
season 7.8
sun 7.7
outdoor 7.6
frozen 7.6
person 7.6
pattern 7.5
reflection 7.5
house 7.5
frame 7.5
tourism 7.4
park 7.4
environment 7.4
ice 7.4
beach 7.4
vacation 7.4
street 7.4
alone 7.3
morning 7.2
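
The Imagga tags look like responses from its /v2/tags endpoint, which scores tags from 0 to 100. A sketch with hypothetical credentials and image URL:

import requests

AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # hypothetical credentials
IMAGE_URL = "https://example.org/boat.jpg"      # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=AUTH,
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence score.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))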

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 95.1
window 91.5
water 90.2
person 90
lake 84.4
clothing 84
tree 77.7
picture frame 7.9
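
The Microsoft tags match the shape of the Azure Computer Vision Analyze operation with visualFeatures=Tags. A sketch against the v2.0 REST endpoint (the region, key, and image URL are assumptions):

import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # hypothetical region
KEY = "YOUR_AZURE_KEY"                                   # hypothetical credential
IMAGE_URL = "https://example.org/boat.jpg"               # hypothetical image URL

response = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Confidences arrive in [0, 1]; scale to match the list above.
for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))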

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Male, 50.4%
Angry 49.5%
Happy 49.5%
Disgusted 50.5%
Calm 49.5%
Fear 49.5%
Surprised 49.5%
Sad 49.5%
Confused 49.5%

AWS Rekognition

Age 16-28
Gender Female, 50.1%
Surprised 49.6%
Sad 49.5%
Fear 49.5%
Angry 49.5%
Happy 49.6%
Disgusted 49.5%
Confused 49.5%
Calm 50.3%

AWS Rekognition

Age 35-51
Gender Male, 50.4%
Fear 49.5%
Surprised 49.6%
Angry 49.5%
Calm 50%
Sad 49.5%
Happy 49.7%
Disgusted 49.5%
Confused 49.6%

AWS Rekognition

Age 33-49
Gender Male, 50%
Sad 49.5%
Disgusted 49.5%
Confused 49.7%
Surprised 49.7%
Angry 49.6%
Calm 49.6%
Happy 49.7%
Fear 49.7%

AWS Rekognition

Age 34-50
Gender Male, 50.4%
Fear 49.5%
Confused 49.5%
Disgusted 49.5%
Angry 49.5%
Sad 49.5%
Calm 50.4%
Happy 49.6%
Surprised 49.5%

AWS Rekognition

Age 16-28
Gender Female, 50%
Disgusted 49.5%
Happy 50.4%
Angry 49.5%
Fear 49.5%
Calm 49.5%
Sad 49.5%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 13-23
Gender Male, 50.3%
Angry 49.5%
Disgusted 49.5%
Fear 49.8%
Sad 49.6%
Happy 49.5%
Calm 49.6%
Confused 49.8%
Surprised 49.6%
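
The per-face age ranges, gender guesses, and emotion scores above have the structure returned by the AWS Rekognition DetectFaces API when all facial attributes are requested. A minimal boto3 sketch (the file name is hypothetical):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical path to the digitized photograph.
with open("durette_studio_boat.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

# One FaceDetail per detected face, mirroring the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")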

Feature analysis

Amazon

Person 99.1%
Boat 95.3%
