Human Generated Data

Title

Untitled (Genest's Bread bread delivery men standing in front of fleet of delivery vehicles)

Date

c.1937

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4006

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.6
Person 99.6
Person 99.6
Person 99.5
Person 98.6
Person 98.5
Person 97.1
Nature 89.3
Wheel 82.1
Machine 82.1
Outdoors 78.7
Path 70.5
Building 67.9
Road 63.6
Transportation 62.9
Bus 62.3
Vehicle 62.3
Fence 61.9
Automobile 61.8
Car 61.8
Weather 61
Porch 55
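
The Amazon tags above are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch follows, using boto3; the image file name, region, and label limits are assumptions, not part of the museum record.

import boto3

# Hypothetical local copy of the digitized photograph.
IMAGE_PATH = "durette_genest_bread_4.2002.4006.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=50,
    )

# Print each label with its confidence score, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")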

Clarifai
created on 2019-06-01

people 98.5
monochrome 96
transportation system 95.5
no person 92
vehicle 91.2
street 90.8
war 89.3
many 86.7
military 85.4
water 84.6
railway 84.5
group 84
group together 83.4
beach 83.3
building 82.6
sea 82.4
road 82.1
adult 81.3
travel 81.3
city 81.2
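
Concept tags like the Clarifai list above can be requested from Clarifai's v2 predict endpoint. The sketch below is an illustration of the general-model workflow: the API key placeholder, model ID, and image file name are assumptions, and the response layout reflects the public v2 API shape.

import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"          # placeholder
MODEL_ID = "general-image-recognition"     # assumed ID of the general concept model
IMAGE_PATH = "durette_genest_bread_4.2002.4006.jpg"

with open(IMAGE_PATH, "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
response.raise_for_status()

# Concepts carry 0-1 confidence values; scale to match the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")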

Imagga
created on 2019-06-01

liner 82
ship 75.3
passenger ship 64.5
vessel 46.1
city 25.8
sky 25.5
craft 21.6
landscape 21.6
architecture 21.1
sea 19.6
boat 19.6
travel 19
snow 17.2
urban 16.6
aircraft carrier 15.6
water 15.4
building 15.2
warship 14.7
port 14.5
transportation 14.4
ocean 14.1
town 13.9
house 13.5
industry 12.8
ski slope 12.7
industrial 12.7
station 12.4
tourism 12.4
cityscape 12.3
slope 12.1
construction 12
dock 11.7
tower 11.6
panorama 11.4
structure 11.4
vehicle 11.3
winter 11.1
transport 11
river 10.7
shipping 10.4
military vehicle 9.9
vacation 9.8
cold 9.5
car 9.5
center 9.2
power 9.2
business 9.1
old 9.1
road 9
landmark 9
outdoors 9
steel 8.8
cruise 8.8
cargo 8.7
light 8.7
factory 8.7
harbor 8.7
pier 8.7
scene 8.7
pollution 8.7
bridge 8.6
facility 8.5
street 8.3
tourist 8.3
plant 8.2
coast 8.1
maritime 7.9
outside 7.7
geological formation 7.6
clouds 7.6
skyline 7.6
energy 7.6
buildings 7.6
historical 7.5
famous 7.4
environment 7.4
exterior 7.4
history 7.2
day 7.1
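
Imagga tags like those above come from its v2 tagging endpoint. A rough sketch under stated assumptions: the key/secret placeholders are illustrative, the image is assumed to be reachable at a public URL, and the endpoint uses HTTP basic auth.

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/durette_genest_bread_4.2002.4006.jpg"  # assumed

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Tags arrive with 0-100 confidence scores and localized names.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")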

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

black and white 92.7
black 79.4
white 69
sky 51.5
old 41.9
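
The Microsoft tags match the output of the Azure Computer Vision analyze operation. The sketch below assumes the v2.0 REST route that was current in 2019, a westus endpoint, and placeholder credentials.

import requests

SUBSCRIPTION_KEY = "YOUR_AZURE_CV_KEY"  # placeholder
ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"  # assumed region
IMAGE_PATH = "durette_genest_bread_4.2002.4006.jpg"

with open(IMAGE_PATH, "rb") as f:
    response = requests.post(
        ENDPOINT,
        params={"visualFeatures": "Tags,Description"},
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
response.raise_for_status()

# Confidence is reported on a 0-1 scale; scale it to match the listing above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")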

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Sad 49.7%
Angry 49.6%
Disgusted 49.6%
Surprised 49.5%
Happy 49.6%
Calm 50%
Confused 49.6%

AWS Rekognition

Age 10-15
Gender Male, 50.2%
Disgusted 49.8%
Sad 49.9%
Happy 49.5%
Surprised 49.5%
Angry 49.6%
Calm 49.6%
Confused 49.7%

AWS Rekognition

Age 26-43
Gender Female, 54.7%
Calm 45.3%
Surprised 45.7%
Sad 46.8%
Confused 45.3%
Disgusted 45.9%
Happy 50.4%
Angry 45.6%

AWS Rekognition

Age 16-27
Gender Female, 50.1%
Disgusted 49.6%
Happy 49.6%
Sad 49.7%
Calm 49.7%
Angry 49.6%
Surprised 49.6%
Confused 49.6%

AWS Rekognition

Age 26-44
Gender Male, 50.3%
Disgusted 49.6%
Surprised 49.5%
Angry 49.7%
Confused 49.5%
Sad 49.6%
Calm 50%
Happy 49.6%
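
The per-face blocks above (age range, gender, and emotion confidences) have the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, with the image file name assumed:

import boto3

IMAGE_PATH = "durette_genest_bread_4.2002.4006.jpg"  # assumed local file

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )

# One block per detected face, mirroring the listings above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")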

Feature analysis

Amazon

Person 99.6%
Wheel 82.1%
Bus 62.3%
Car 61.8%
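
The feature analysis entries are the subset of Rekognition labels that come back with located instances (bounding boxes), such as Person, Wheel, Bus, and Car. A short sketch, assuming response is the DetectLabels result from the earlier example:

# Keep only labels that carry located instances (bounding boxes).
for label in response["Labels"]:
    if label.get("Instances"):
        print(f"{label['Name']} {label['Confidence']:.1f}%")
        for instance in label["Instances"]:
            box = instance["BoundingBox"]
            print(f"  box: left={box['Left']:.2f} top={box['Top']:.2f} "
                  f"width={box['Width']:.2f} height={box['Height']:.2f}")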

Categories

Imagga

cars vehicles 99.7%

Text analysis

Amazon

B5-223
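
The detected string "B5-223" is presumably lettering visible in the photograph, for example on one of the delivery vehicles. Text detections of this kind are returned by AWS Rekognition's DetectText operation; the sketch below assumes the same local image file as the earlier examples.

import boto3

IMAGE_PATH = "durette_genest_bread_4.2002.4006.jpg"  # assumed local file

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Report full detected lines; WORD-level detections are children of LINE entries.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")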