Human Generated Data

Title

Untitled (New York City?)

Date

c. 1935

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2473

Machine Generated Data

Tags (confidence, 0-100)

Amazon
created on 2023-10-06

Architecture 99.9
Building 99.9
Factory 99.9
Person 98.2
Person 98.1
Person 97.9
Person 96.8
Person 96.7
Person 96.4
Person 93.7
Brewery 82.1
Outdoors 75.9
Clothing 61.9
Hat 61.9
Manufacturing 61.1
Nature 57.9
Worker 56.1
Brick 55.7
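
These labels match the output shape of Amazon Rekognition's DetectLabels operation. As a minimal sketch, tags like these can be produced with boto3; the file name and thresholds below are illustrative assumptions, not part of the record:

    import boto3

    # Assumed local file name; the record does not include an image path.
    with open("untitled_new_york_city.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # the record lists 18 labels
        MinConfidence=55.0,  # the lowest listed score is 55.7
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")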

Clarifai
created on 2018-05-10

people 100
adult 99.5
group together 99.4
group 99
man 98.2
many 97.8
administration 94.9
one 93.4
barrel 93.1
several 92.8
two 92.2
woman 91.2
wear 90.4
drum 89.9
three 89.7
container 89.5
four 87.1
monochrome 85.1
watercraft 84.2
war 83.9
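
At this record's 2018 tag date, Clarifai's general model was typically called through its legacy v2 Python client; the sketch below assumes that client and a placeholder API key (Clarifai has since moved to a gRPC-based SDK):

    from clarifai.rest import ClarifaiApp  # legacy v2 client, current as of 2018

    app = ClarifaiApp(api_key="YOUR_API_KEY")  # placeholder credential
    model = app.public_models.general_model
    result = model.predict_by_filename("untitled_new_york_city.jpg")  # assumed file name
    # Concept values are 0-1; scale to match the 0-100 scores listed above.
    for concept in result["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))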

Imagga
created on 2023-10-06

container 63.9
vessel 34.8
steel drum 34.1
percussion instrument 33.4
bucket 32.6
bin 28.2
ashcan 26.8
musical instrument 25
barrel 17.5
can 15.1
old 13.2
winery 11.6
milk can 10.9
drum 10.7
home 10.4
garbage 9.8
house 9.2
clean 9.2
wood 9.2
environment 9
wooden 8.8
work 8.7
recycle 8.7
counter 8.6
wall 8.5
industry 8.5
basket 8.3
street 8.3
metal 8
urban 7.9
architecture 7.8
glass 7.8
waste 7.8
building 7.6
oil 7.4
brown 7.4
bottle 7.3
industrial 7.3
tank 7.2
dirty 7.2
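
Imagga's tagger is a plain REST endpoint with HTTP basic auth. A sketch against the v2 /tags endpoint; the credentials and image URL are placeholders:

    import requests

    # Placeholder credentials and image URL, not part of the record.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/untitled_new_york_city.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )
    # Confidence is already on a 0-100 scale in Imagga's response.
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))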

Google
created on 2018-05-10
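
No Google tags survive in this record, only the creation date. For comparison with the other services, label detection on Google Cloud Vision looks roughly like this (a sketch; the client setup and image URI are assumptions):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # uses application-default credentials
    image = vision.Image()
    image.source.image_uri = "https://example.org/untitled_new_york_city.jpg"  # placeholder
    response = client.label_detection(image=image)
    # Scores are 0-1; scale to match the other services' 0-100 listings.
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))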

Microsoft
created on 2018-05-10

building 99.8
outdoor 91.5
black 79.5
old 63.7
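
These four tags match the shape of Azure Computer Vision's tagging output. A sketch against the REST tag endpoint; the resource endpoint, key, and image URL are placeholders (the 2018 run would have used an earlier API version):

    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    resp = requests.post(
        f"{endpoint}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder
        json={"url": "https://example.org/untitled_new_york_city.jpg"},
    )
    # Confidence is 0-1; scale to match the 0-100 listings above.
    for tag in resp.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))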

Color Analysis

Face analysis

Amazon

AWS Rekognition (face 1)

Age 18-26
Gender Male, 97.5%
Sad 87.3%
Happy 33.2%
Calm 10.9%
Confused 9.4%
Surprised 7.8%
Fear 6.2%
Angry 1.5%
Disgusted 1.1%

AWS Rekognition (face 2)

Age 14-22
Gender Male, 78.4%
Calm 61.5%
Fear 16.9%
Sad 12.9%
Surprised 7.1%
Disgusted 2.4%
Happy 1.6%
Angry 1.2%
Confused 0.7%
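
Both face records match the output of Rekognition's DetectFaces operation with full attributes. A minimal sketch with boto3 (the file name is an assumption):

    import boto3

    rekognition = boto3.client("rekognition")
    with open("untitled_new_york_city.jpg", "rb") as f:  # assumed file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # needed for age, gender, and emotion estimates
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions come back unsorted; list strongest first, as in the record.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")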

Feature analysis

Amazon

Person 98.2%
Hat 61.9%

Categories