Human Generated Data

Title

Untitled (people jumping on many outdoor trampolines)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14944

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.2
Person 99.2
Person 98.8
Person 98.1
Person 96.3
Person 95.7
Person 93.5
Person 92.3
Person 90.3
Building 89
Architecture 87.9
Person 85.9
Person 79.3
Nature 76.6
Person 76.3
People 72.8
Urban 71
Person 69.8
City 69.5
Town 69.5
Waterfront 69.4
Water 69.4
Dome 67.5
Person 65.1
Person 64.6
Person 62.8
Person 62.1
Outdoors 60.2
Pier 59.8
Port 59.8
Dock 59.8
Arena 57.9
Amphitheater 57.9
Amphitheatre 57.9
Smoke 57.7
Road 56.7
Person 55.6
Person 53.1
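
The pairs above are label/confidence scores. As a hypothetical illustration (not part of the museum record), such machine-generated tags are often post-processed with a simple confidence threshold; the sketch below hardcodes a few (label, score) pairs copied from the Amazon list, and the 80.0 threshold is an assumption:

```python
# Hypothetical post-processing sketch: filter machine-generated tags by
# confidence score. The (label, score) pairs are copied from the Amazon
# Rekognition list above; the threshold value is an assumption.
tags = [
    ("Person", 99.2), ("Building", 89.0), ("Architecture", 87.9),
    ("Nature", 76.6), ("People", 72.8), ("Urban", 71.0),
    ("Waterfront", 69.4), ("Dome", 67.5), ("Outdoors", 60.2),
]

def confident_tags(tags, threshold=80.0):
    """Return labels whose confidence meets or exceeds the threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_tags(tags))  # ['Person', 'Building', 'Architecture']
```

A higher threshold keeps only the labels the model is most certain of (here, the people and the built structures); lower thresholds admit speculative labels such as "Waterfront" or "Dome".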

Imagga
created on 2022-03-05

sky 28.9
travel 28.9
architecture 28.5
city 28.3
tourism 24.8
water 24.7
turbine 24.6
building 20.7
landscape 19.3
ocean 18.3
sea 18
structure 17.7
urban 15.7
park bench 15.6
bench 14.3
boat 13.9
marina 13.7
bridge 13.2
aerial 12.6
tower 12.5
vacation 12.3
intersection 12.2
street 12
seat 11.8
transportation 11.7
river 11.6
harbor 11.6
device 11.5
skyscraper 11.5
bay 11.3
old 11.2
tourist 10.9
landmark 10.8
snow 10.8
boats 10.7
patio 10.4
sundial 10.3
summer 10.3
shore 10.2
shoreline 10.2
area 10.2
clouds 10.1
lake 10.1
outdoor 9.9
park 9.9
beach 9.9
coast 9.9
highway 9.6
skyline 9.5
cityscape 9.5
buildings 9.5
day 9.4
road 9
history 8.9
stone 8.8
port 8.7
concrete 8.6
traffic 8.6
destination 8.4
famous 8.4
town 8.4
timepiece 8.2
furniture 8.2
scenic 7.9
scene 7.8
construction 7.7
culture 7.7
downtown 7.7
historic 7.3
island 7.3
scenery 7.2
fence 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.6
black and white 97.6
city 89.6
monochrome 89.4
outdoor 85.2
ship 81.6
sky 80.2
black 79.5
street 64.4
white 63.6
old 42.5
vintage 34

Face analysis

Amazon

AWS Rekognition

Age 12-20
Gender Female, 83.3%
Calm 33.2%
Fear 28.7%
Sad 15.8%
Disgusted 7.7%
Surprised 4.7%
Confused 4.4%
Happy 3.2%
Angry 2.3%

AWS Rekognition

Age 16-24
Gender Male, 91.4%
Calm 37.4%
Surprised 20.2%
Sad 16.7%
Happy 15.8%
Confused 3.5%
Angry 2.3%
Fear 2.1%
Disgusted 1.9%

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a vintage photo of a person 85.1%
a vintage photo of a person 84.6%
a vintage photo of a train yard 58.1%

Text analysis

Amazon

Т37А°--АX
Concign