Human Generated Data

Title

Untitled (large group of children on trampolines)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17868

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.8
Human 98.8
Person 98.2
Person 97.8
Person 94
Person 93.7
Person 91.9
Person 91.9
Person 88.8
Person 86.9
Water 85.8
Person 85.5
Waterfront 82.1
Word 78.9
Person 78.2
Person 76.9
Port 76.3
Pier 76.3
Dock 76.3
Person 75.4
Person 75.3
Building 74.9
Person 74.2
Road 71.4
People 70.9
Urban 63.9
Outdoors 62.3
Architecture 61.1
Nature 60.2
Person 58.3
Town 57.6
City 57.6
Tarmac 56.5
Asphalt 56.5
Person 51.3
Person 47.2
Person 41.9

Imagga
created on 2022-02-26

ship 37.3
marina 36.4
city 27.5
liner 25.6
architecture 25.5
vessel 24.3
deck 23.4
building 23.2
travel 22.5
sea 21.9
warship 21.6
water 20.7
aircraft carrier 20.2
passenger ship 20
tourism 19.8
sky 19.8
urban 18.4
ocean 18.3
boat 17
harbor 16.4
cityscape 16.1
landscape 15.6
vacation 15.6
military vehicle 15.4
structure 14.6
vehicle 14
skyscraper 13.6
aerial 13.6
transportation 13.5
skyline 13.3
town 13
craft 12.8
boats 12.6
tower 12.5
port 12.5
bridge 12.3
buildings 12.3
bay 12.3
street 12
tourist 11.9
business 11.5
summer 10.9
outdoor 10.7
pier 10.5
shore 10.2
holiday 10
center 9.7
above 9.7
beach 9.6
hall 9.5
yacht 9.2
waterfront 9.2
road 9
coast 9
device 8.8
sail 8.8
office 8.7
panorama 8.6
line 8.6
traffic 8.6
famous 8.4
old 8.4
speed 8.3
lake 8.2
world 8.2
technology 8.2
landmark 8.1
river 8
dock 7.8
highway 7.7
downtown 7.7
clouds 7.6
marine 7.6
sign 7.5
sand 7.4
intersection 7.2
day 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

black and white 97.8
ship 94.9
outdoor 93
text 89.1
city 84.5
black 84.2
monochrome 82.9
white 66.3
street 54.9

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Female, 69.7%
Calm 38.4%
Happy 38%
Sad 12.1%
Fear 3.5%
Confused 3.2%
Surprised 1.8%
Angry 1.5%
Disgusted 1.4%

AWS Rekognition

Age 14-22
Gender Male, 90.8%
Calm 68.2%
Angry 14.1%
Sad 10%
Fear 3.4%
Confused 1.4%
Happy 1.3%
Disgusted 1.1%
Surprised 0.5%

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a group of people around a track 67.6%
a group of people standing in front of a building 67.5%
a group of people standing outside of a building 67.4%

Text analysis

Amazon

Sunclas