Human Generated Data

Title

Untitled (family standing outside and in doorway of cabin)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6005

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99
Person 99
Person 98.9
Nature 98.6
Outdoors 98.4
Person 97.7
Housing 96.9
Building 96.9
Person 96.8
Person 96.4
Person 96
Person 95.7
Person 95.5
Person 91.3
Countryside 89.8
House 89
Person 87.4
Home Decor 83.5
Rural 82.4
Hut 82.4
Person 80.3
Shack 78.7
Person 78.7
Cabin 75.6
Person 71.2
Shelter 70.8
Window 69.7
Person 67.9
Person 65.5
Porch 56.6

Clarifai
created on 2019-11-16

people 99.4
monochrome 98.8
winter 98
snow 97.8
window 97.8
no person 95.7
adult 94.6
vintage 94.6
house 94.5
home 94.4
man 94.2
group 93.4
farmhouse 92.3
vehicle 91.7
old 91.7
water 90.2
two 88.8
wood 88.3
storm 87.6
wear 87.1

Imagga
created on 2019-11-16

window 43.9
building 34.4
structure 32
snow 28.2
architecture 27.9
framework 23.4
window screen 22.7
old 22.3
winter 19.6
trees 19.6
screen 19.2
sky 17.9
supporting structure 17.7
wood 17.5
house 16.7
forest 16.5
landscape 16.4
fence 16.3
protective covering 15.9
cold 15.5
travel 15.5
wall 15.1
balcony 14.2
city 14.1
season 14
door 14
barrier 13.9
black 13.8
greenhouse 13.5
tree 13.3
grunge 12.8
texture 12.5
scene 12.1
home 12
windowsill 12
picket fence 11.6
vintage 11.6
holiday 11.5
urban 11.4
facade 11.1
frame 11
scenery 10.8
windows 10.6
covering 10.3
weather 10.2
exterior 10.1
street 10.1
light 10
park 10
sill 9.6
antique 9.5
sliding door 9.5
glass 9.3
stone 9.3
town 9.3
structural member 9.1
vacation 9
negative 8.7
frost 8.6
outdoors 8.2
paint 8.1
dirty 8.1
obstruction 8.1
natural 8
scenic 7.9
design 7.9
empty 7.7
modern 7.7
outdoor 7.6
frozen 7.6
weathered 7.6
grungy 7.6
brick 7.5
pattern 7.5
film 7.4
historic 7.3
art 7.2
history 7.2
night 7.1
country 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

tree 99.8
text 94.1
house 91.5
outdoor 87.4
building 83.1
old 62.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 19-31
Gender Female, 50.1%
Happy 49.5%
Fear 49.5%
Disgusted 49.7%
Angry 49.5%
Surprised 49.5%
Sad 50.2%
Confused 49.5%
Calm 49.5%

AWS Rekognition

Age 23-35
Gender Male, 50.5%
Confused 49.5%
Fear 49.5%
Angry 49.5%
Sad 49.5%
Disgusted 49.5%
Surprised 49.6%
Happy 49.5%
Calm 50.3%

AWS Rekognition

Age 30-46
Gender Male, 50.5%
Surprised 49.5%
Angry 49.5%
Happy 49.5%
Fear 49.5%
Calm 49.6%
Disgusted 49.5%
Sad 49.5%
Confused 50.4%

AWS Rekognition

Age 20-32
Gender Female, 50%
Surprised 49.6%
Calm 49.6%
Happy 49.6%
Angry 50%
Fear 49.5%
Sad 49.6%
Confused 49.5%
Disgusted 49.6%

AWS Rekognition

Age 35-51
Gender Male, 50.2%
Angry 49.7%
Happy 49.5%
Disgusted 49.5%
Sad 49.9%
Calm 49.5%
Surprised 49.5%
Confused 49.5%
Fear 49.9%

AWS Rekognition

Age 18-30
Gender Male, 50.4%
Surprised 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 49.5%
Calm 49.5%
Confused 50.5%
Fear 49.5%
Angry 49.5%

AWS Rekognition

Age 36-52
Gender Male, 50.5%
Surprised 49.5%
Sad 49.5%
Calm 50.2%
Happy 49.7%
Confused 49.5%
Disgusted 49.5%
Fear 49.5%
Angry 49.5%

AWS Rekognition

Age 46-64
Gender Male, 50.4%
Happy 49.5%
Fear 49.6%
Disgusted 49.5%
Angry 49.5%
Surprised 49.8%
Sad 49.6%
Confused 49.6%
Calm 49.9%

AWS Rekognition

Age 23-35
Gender Male, 50.3%
Disgusted 49.5%
Happy 49.7%
Angry 49.5%
Fear 49.5%
Calm 50.3%
Confused 49.5%
Surprised 49.5%
Sad 49.5%

Feature analysis

Amazon

Person 99%

Categories