Human Generated Data

Title

Untitled (nine figures standing on deck and stairs of house)

Date

c. 1910–1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3613

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 98.8
Person 98.8
Person 98.5
Person 97.2
Person 97.1
Building 95.6
Housing 95.6
Person 93.1
Neighborhood 93
Urban 93
Person 91
Nature 89.6
Outdoors 82.2
House 77.3
Fence 76.6
Villa 70.9
Porch 66
Siding 63.1
Snow 59.7
Banister 55.9
Handrail 55.9
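
These labels follow the output format of Amazon Rekognition's DetectLabels API. A minimal sketch of how such tags can be reproduced with boto3, assuming the photograph is available as a local file (photo.jpg is a placeholder name):

import boto3

# Region and credentials come from the standard AWS configuration chain.
client = boto3.client("rekognition")

# DetectLabels returns scene and object labels with 0-100 confidence scores.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # roughly the lowest confidence shown above
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")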

Clarifai
created on 2019-06-01

home 99.2
people 99.2
building 98.9
house 98.2
architecture 98.2
street 95.4
group 94.6
no person 93.8
family 93.7
town 93
monochrome 92
farmhouse 91.4
porch 91.4
bungalow 90.1
window 89.6
adult 89.1
wood 85.6
group together 85.5
tree 84.4
fence 83.9
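
The Clarifai concepts above are what the service's public "general" image-recognition model returns; it scores concepts from 0 to 1, shown here as percentages. A sketch against the Clarifai v2 REST API, assuming placeholder credentials and a hosted copy of the image (the model alias general-image-recognition is also an assumption):

import requests

API_KEY = "YOUR_CLARIFAI_KEY"                # placeholder
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Each concept carries a name and a 0-1 confidence value.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")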

Imagga
created on 2019-06-01

picket fence 89
fence 76.4
balcony 60.7
barrier 54.3
building 51
architecture 49.3
structure 48.5
house 39.4
obstruction 36.5
home 28.7
facade 28.4
city 20.8
window 20.6
travel 20.4
sky 19.8
tourism 17.3
estate 17.1
old 16.7
town 16.7
urban 16.6
residential 16.3
construction 16.3
landmark 15.4
door 15.2
exterior 14.8
history 14.3
residence 13.7
government 13.7
street 12.9
stone 12.7
roof 12.4
brick 12.3
wall 12
historic 11.9
mobile home 11.8
architectural 11.5
windows 11.5
real 11.4
housing 11.2
property 10.6
entrance 10.6
buildings 10.4
style 10.4
famous 10.2
column 10.2
clouds 10.1
wood 10
outdoor 9.9
columns 9.8
classical 9.6
historical 9.4
trailer 9.4
traditional 9.2
coast 9
new 8.9
politics 8.8
village 8.7
ancient 8.7
railing 8.6
holiday 8.6
capital 8.5
classic 8.4
vacation 8.2
mortgage 7.8
glass 7.8
museum 7.8
luxury 7.7
winter 7.7
square 7.2
night 7.1
wheeled vehicle 7.1
wooden 7
sea 7
country 7
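
Imagga's tagger is a plain REST endpoint authenticated with HTTP Basic credentials. A minimal sketch, with the key, secret, and image URL as placeholders:

import requests

API_KEY = "YOUR_IMAGGA_KEY"                  # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"            # placeholder
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder

# The /v2/tags endpoint scores tags on a 0-100 scale, as listed above.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")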

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

building 99.3
house 97
outdoor 95.1
white 83.6
porch 80.4
black 72.7
window 69.6
old 67
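
The Microsoft tags correspond to Azure Computer Vision's image-tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK, with the endpoint, key, and image URL as placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image scores tags from 0 to 1; the listing above shows percentages.
result = client.tag_image("https://example.com/photo.jpg")  # placeholder URL
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")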

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 50.3%
Sad 49.6%
Confused 49.5%
Disgusted 49.7%
Surprised 49.6%
Angry 49.7%
Happy 49.6%
Calm 49.8%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Confused 49.6%
Disgusted 49.6%
Happy 49.6%
Surprised 49.6%
Calm 49.6%
Sad 49.8%
Angry 49.9%

AWS Rekognition

Age 17-27
Gender Female, 50.4%
Happy 49.5%
Confused 49.5%
Disgusted 50.4%
Sad 49.5%
Calm 49.5%
Angry 49.6%
Surprised 49.5%

AWS Rekognition

Age 26-44
Gender Female, 50.4%
Disgusted 49.5%
Sad 49.7%
Happy 49.6%
Confused 49.5%
Surprised 49.5%
Angry 49.6%
Calm 50.1%

AWS Rekognition

Age 57-77
Gender Female, 50.2%
Happy 49.6%
Confused 49.6%
Angry 49.6%
Sad 49.8%
Calm 49.8%
Surprised 49.6%
Disgusted 49.6%

AWS Rekognition

Age 23-38
Gender Male, 50.3%
Surprised 49.5%
Calm 49.5%
Disgusted 50.5%
Confused 49.5%
Sad 49.5%
Happy 49.5%
Angry 49.5%

AWS Rekognition

Age 16-27
Gender Female, 50.1%
Disgusted 49.5%
Confused 49.5%
Surprised 49.5%
Calm 49.6%
Happy 49.5%
Angry 49.6%
Sad 50.2%

AWS Rekognition

Age 48-68
Gender Female, 50.4%
Sad 50.1%
Angry 49.6%
Happy 49.6%
Confused 49.6%
Calm 49.5%
Surprised 49.5%
Disgusted 49.5%
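
Each block above is one detected face from Amazon Rekognition's DetectFaces API, which estimates an age range, gender, and a confidence score for each of its emotion categories. A minimal boto3 sketch, again with photo.jpg as a placeholder:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # e.g. HAPPY, SAD, CALM
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")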

Feature analysis

Amazon

Person 98.8%

Categories

Text analysis

Amazon

FF
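
The detected string above comes from Amazon Rekognition's DetectText API, which returns recognized lines and words with confidence scores. A minimal sketch:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = client.detect_text(Image={"Bytes": f.read()})

# Detections are typed LINE or WORD; "FF" above is one detected string.
for det in response["TextDetections"]:
    print(det["DetectedText"], det["Type"], f"{det['Confidence']:.1f}")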