Human Generated Data

Title

Untitled (members of wedding party posed outdoors next to decorated car)

Date

1947

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6503

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Person 99
Human 99
Car 98.8
Automobile 98.8
Vehicle 98.8
Transportation 98.8
Person 97.6
Fence 93.5
Car 92.9
Person 92.7
Person 91.7
Person 91.2
Person 80.4
Picket 69.7
Building 68
People 64.9
Urban 64.6
Person 64.5
Railing 57.7
Outdoors 56.4
Road 56.2
Neighborhood 55.9

Clarifai
created on 2019-03-22

people 99.7
group together 98.4
group 97.3
adult 96.9
man 96
monochrome 95.9
street 94.1
many 92.1
woman 91.8
child 89.9
two 87.6
war 87.1
vehicle 86.7
one 85.1
fence 85
military 84.3
transportation system 80.7
black and white 77.7
three 77.5
several 77

Imagga
created on 2019-03-22

picket fence 100
fence 100
barrier 78.4
obstruction 51.8
structure 34.2
sky 22.3
building 19.1
pier 19
water 18.7
city 16.6
architecture 16.4
ocean 14.1
travel 14.1
night 13.3
sea 13.3
industrial 12.7
beach 12.6
landscape 12.6
bridge 12.3
support 11.8
house 11.7
ship 11.5
urban 11.4
tourism 10.7
vacation 10.6
pollution 10.6
sand 10.5
industry 10.2
clouds 10.1
street 10.1
liner 10.1
tower 9.9
factory 9.7
device 9.6
grass 9.5
winter 9.4
light 9.4
power 9.2
passenger ship 9.1
road 9
landmark 9
vessel 9
plant 9
reflection 8.9
river 8.9
outdoor 8.4
energy 8.4
summer 8.4
town 8.3
scenery 8.1
steel 8.1
chimney 7.8
station 7.7
railing 7.7
construction 7.7
tree 7.7
old 7.7
electricity 7.6
wood 7.5
boat 7.4
tourist 7.3
home 7.2

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

outdoor 89.5
black 73
white 68.3
black and white 68.3
monochrome 32.3
street 18.6
infrared 12.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 50.2%
Angry 49.6%
Surprised 49.6%
Happy 49.5%
Disgusted 49.5%
Sad 50.1%
Confused 49.5%
Calm 49.7%

AWS Rekognition

Age 57-77
Gender Male, 52.4%
Calm 46.2%
Confused 45.1%
Angry 45.1%
Sad 53.3%
Surprised 45.1%
Disgusted 45.1%
Happy 45.1%

AWS Rekognition

Age 45-63
Gender Male, 50.3%
Calm 50.3%
Disgusted 49.5%
Confused 49.5%
Happy 49.5%
Sad 49.6%
Surprised 49.5%
Angry 49.5%

AWS Rekognition

Age 48-68
Gender Male, 50.4%
Calm 46.6%
Surprised 45.8%
Happy 46.5%
Sad 45.9%
Disgusted 48%
Angry 46.3%
Confused 45.9%

AWS Rekognition

Age 35-55
Gender Male, 50.2%
Disgusted 49.5%
Calm 49.7%
Angry 49.6%
Sad 50.1%
Happy 49.6%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 23-38
Gender Male, 54.6%
Surprised 45.1%
Sad 47.3%
Confused 45.4%
Disgusted 45.1%
Calm 51.7%
Happy 45%
Angry 45.3%

Feature analysis

Amazon

Person 99%
Car 98.8%

Categories

Imagga

cars vehicles 99.6%

Text analysis

Amazon

82204