Human Generated Data

Title

Untitled (large group of children on trampolines)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17867

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Water 98.9
Person 97.9
Human 97.9
Waterfront 97.6
Person 96.1
Person 95.1
Person 94.7
Pier 93.6
Dock 93.6
Port 93.6
Person 93.5
Person 92.6
Person 92.1
Nature 87.1
Person 84.9
Person 82.7
Person 79.3
Person 76.3
Person 76.2
Harbor 74.9
Person 74.1
Outdoors 70.9
Person 70.8
Marina 67.7
Vehicle 64.4
Transportation 64.4
Person 63.7
Person 60.2
Ship 59.5
Military 59.5
Navy 59.5
Cruiser 59.5
Vessel 58.8
Watercraft 58.8
Boat 55.8
Clothing 55.4
Shorts 55.4
Apparel 55.4
Person 46.9
Person 44.0
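The repeated Person entries above are per-instance detections, one per figure the model found, each with its own confidence. A minimal sketch of collapsing such a label list to the best score per label, using a hypothetical subset of the values listed (the helper name and threshold are illustrative, not part of any Rekognition API):

```python
# Hypothetical subset of the Amazon label list above:
# (label, confidence) pairs may repeat once per detected instance.
labels = [
    ("Water", 98.9), ("Person", 97.9), ("Person", 96.1),
    ("Pier", 93.6), ("Person", 44.0),
]

def collapse(pairs, min_conf=50.0):
    """Keep the highest confidence seen for each label, dropping weak hits."""
    best = {}
    for name, conf in pairs:
        if conf >= min_conf and conf > best.get(name, 0.0):
            best[name] = conf
    return best

print(collapse(labels))  # {'Water': 98.9, 'Person': 97.9, 'Pier': 93.6}
```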

Imagga
created on 2022-02-26

intersection 40.4
ship 36.5
aircraft carrier 36
city 32.4
warship 32.3
military vehicle 24.4
sea 24.4
architecture 24.2
travel 24
sky 22.3
tourism 22.3
water 22
vessel 21.8
aerial 20.4
building 20.2
harbor 19.3
vehicle 19.1
urban 18.4
marina 17.9
cityscape 17
landscape 16.4
panorama 16.2
buildings 16.1
town 15.8
port 15.4
boat 15.3
ocean 14.9
boats 14.6
tower 14.3
tourist 13.6
structure 12.7
old 12.5
skyline 12.4
shore 12.1
clouds 11.8
landmark 11.7
transportation 11.7
beach 11
st 10.7
center 10.7
famous 10.2
street 10.1
history 9.8
river 9.8
skyscraper 9.6
marine 9.5
bay 9.4
church 9.3
craft 9.2
summer 9
vacation 9
coast 9
liner 8.8
construction 8.6
stone 8.6
sand 8.5
outdoor 8.4
shoreline 8.2
yacht 7.8
scene 7.8
downtown 7.7
traffic 7.6
capital 7.6
bridge 7.6
trip 7.5
waves 7.4
historic 7.3
island 7.3
road 7.2
scenery 7.2
world 7.1
passenger ship 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

ship 98.7
black and white 96.1
text 89.3
outdoor 85.1
sky 81.4
monochrome 78.5
city 76.1
black 66.4
white 61.3

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Female, 51.9%
Calm 81.3%
Sad 10.0%
Confused 2.9%
Angry 1.8%
Surprised 1.8%
Disgusted 1.4%
Fear 0.5%
Happy 0.4%

AWS Rekognition

Age 23-31
Gender Female, 77.4%
Calm 97.4%
Angry 0.9%
Sad 0.6%
Happy 0.4%
Surprised 0.3%
Fear 0.2%
Confused 0.1%
Disgusted 0.1%
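Each face record above lists eight emotion percentages; the one with the highest score is the model's dominant-emotion call. A minimal sketch of that selection, using the scores copied from the second face record (the function name is an illustration, not Rekognition's API):

```python
# Emotion scores copied from the second AWS Rekognition face record above.
emotions = {
    "Calm": 97.4, "Angry": 0.9, "Sad": 0.6, "Happy": 0.4,
    "Surprised": 0.3, "Fear": 0.2, "Confused": 0.1, "Disgusted": 0.1,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(emotions))  # ('Calm', 97.4)
```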

Feature analysis

Amazon

Person 97.9%
Boat 55.8%

Captions

Microsoft

a vintage photo of a building 82.8%
a vintage photo of a train 50.3%
a vintage photo of a train station 50.2%

Text analysis

Amazon

KODAR
Suncina