Human Generated Data

Title

Untitled (crowd standing in cemetery observing funeral)

Date

1957

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6452
Machine Generated Data

Tags

Amazon
created on 2019-03-22

Human 99.8
Person 99.8
Person 99.6
Person 98.7
Person 97.8
Person 97
Nature 96.8
Person 96.7
Person 96.3
Outdoors 96
Person 92.5
Automobile 91.8
Transportation 91.8
Vehicle 91.8
Car 91.8
Person 90
Person 84.7
Land 80.7
Plant 80.5
Vegetation 80.3
Meal 71
Food 71
Tree 68.7
Person 68.7
Home Decor 68.3
Person 68.3
Countryside 65.6
Clothing 65.3
Apparel 65.3
Furniture 58.7
Couch 58.7
Grass 57.7
Woodland 57.3
Forest 57.3
Female 57
Night 56.9
Person 54.9

Clarifai
created on 2019-03-22

winter 99.1
snow 98.9
tree 97.3
frost 96.4
cold 95.5
no person 93.3
people 93.2
weather 91.3
frozen 90.8
light 87.7
cemetery 87.5
outdoors 87
frosty 86.8
house 85.7
dark 85.7
landscape 85.6
street 84.4
nature 84.3
wood 84.1
season 84

Imagga
created on 2019-03-22

mobile home 96.8
housing 82.1
structure 78.7
trailer 78.1
wheeled vehicle 58.6
vehicle 37.7
picket fence 37.3
fence 37.1
house 32.6
building 27
trees 24.9
barrier 22.4
sky 22.3
rural 21.1
landscape 20.8
tree 20
architecture 19.5
conveyance 19.4
wood 18.4
old 18.1
home 16
summer 15.4
winter 15.3
obstruction 15
grass 14.2
country 14.1
snow 13.4
farm 13.4
clouds 12.7
water 12.7
travel 12.7
barn 12.5
hut 12.4
scenic 12.3
season 11.7
outdoor 11.5
wooden 11.4
garden 11.1
sea 10.9
road 10.8
holiday 10.8
houses 10.7
sun 10.5
forest 10.4
scene 10.4
cold 10.3
scenery 9.9
buildings 9.5
light 9.4
field 9.2
countryside 9.1
coast 9
mist 8.7
sunny 8.6
beach 8.4
dark 8.4
ocean 8.3
park 8.2
morning 8.1
dwelling 8.1
night 8
seasonal 7.9
weather 7.8
fog 7.7
residential 7.7
outdoors 7.5
tourism 7.4
mountains 7.4
exterior 7.4
vacation 7.4
island 7.3
roof 7
agriculture 7

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

outdoor 98.1
black 79.2
white 76.7
night 76.7
black and white 71.4
monochrome 50.7
light 38.3
winter 19.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Calm 49.5%
Disgusted 49.5%
Confused 49.5%
Happy 49.5%
Sad 50.4%
Surprised 49.5%
Angry 49.6%

AWS Rekognition

Age 35-52
Gender Male, 50%
Happy 49.5%
Surprised 49.5%
Calm 50%
Angry 49.5%
Confused 49.5%
Sad 49.9%
Disgusted 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Sad 50.4%
Disgusted 49.5%
Calm 49.5%
Surprised 49.5%
Confused 49.5%
Angry 49.6%
Happy 49.5%

Feature analysis

Amazon

Person 99.8%
Car 91.8%

Text analysis

Google

McCOLLEYs