Human Generated Data

Title

Untitled (two photographs: crowd late in day; parking lot with Ladies Rest Room sign)

Date

c. 1940, printed later

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6740

Machine Generated Data

Tags (confidence scores out of 100)

Amazon
created on 2019-11-16

Person 99.4
Human 99.4
Person 98.8
Person 98.5
Person 97.9
Person 97.8
Person 97.7
Collage 97.2
Advertisement 97.2
Poster 97.2
Person 97.2
Person 97.2
Person 97.2
Person 97
Person 96.8
Person 96.8
Person 95.7
Person 94.1
Wheel 92.9
Machine 92.9
Person 92.2
Person 89.2
Automobile 81
Vehicle 81
Transportation 81
Car 81
Wheel 80.5
Person 80.2
Person 70.6
Pedestrian 69.2
Apparel 67.7
Clothing 67.7
Person 67
Car 65.8
Wheel 65.4
Urban 62.2
Person 60.3
LCD Screen 58.9
Display 58.9
Monitor 58.9
Electronics 58.9
Screen 58.9
Overcoat 58.3
Coat 58.3
Person 48.2
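
The Amazon tags above are the kind of output produced by the AWS Rekognition DetectLabels API. The sketch below shows one plausible way such labels could be generated with boto3; the file name and thresholds are illustrative assumptions, not values taken from this record.

    import boto3

    # Hypothetical example: request labels for the scanned photograph.
    # "photo.jpg", MaxLabels, and MinConfidence are placeholders, not part of this record.
    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=45,  # low enough to keep entries such as "Person 48.2"
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

Each Rekognition label can also carry an Instances list with bounding boxes, which is most likely why "Person" appears so many times above: each entry appears to correspond to a separately detected figure rather than a duplicate tag.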

Clarifai
created on 2019-11-16

people 99.9
street 99.5
monochrome 98.9
many 98.7
group together 98.3
group 97.5
man 96.3
child 94.8
crowd 92.4
woman 92.1
adult 91.7
transportation system 90.4
vehicle 88.5
city 88.5
spectator 85.7
analogue 85.3
movie 85.3
airport 85
bike 84.8
light 83.2

Imagga
created on 2019-11-16

window 22.9
shop 20.2
black 17.4
man 16.8
mercantile establishment 16
light 15.4
building 15.2
architecture 13.3
television 13
barbershop 12.6
aquarium 12.5
travel 12
art 11.7
glass 11.7
business 11.5
urban 11.3
case 11.1
people 11.1
old 11.1
industry 11.1
place of business 10.8
interior 10.6
grunge 10.2
design 10.1
city 10
shelf 9.9
sky 9.6
men 9.4
chair 9.4
modern 9.1
industrial 9.1
landscape 8.9
technology 8.9
equipment 8.8
construction 8.5
male 8.5
dirty 8.1
metal 8
detail 8
door 7.9
work 7.9
scene 7.8
station 7.7
wall 7.7
texture 7.6
journey 7.5
house 7.5
silhouette 7.4
vintage 7.4
inside 7.4
music 7.2
transportation 7.2
night 7.1
steel 7.1
working 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

black and white 96.9
text 91
person 88.5
street 81.9
monochrome 80
man 67.2
people 64.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-42
Gender Male, 53.9%
Confused 45.7%
Surprised 45%
Disgusted 45%
Happy 45%
Calm 45.5%
Sad 53.5%
Fear 45.2%
Angry 45.1%

AWS Rekognition

Age 28-44
Gender Male, 50.4%
Disgusted 49.5%
Happy 49.5%
Surprised 49.5%
Sad 49.5%
Fear 50.4%
Calm 49.5%
Angry 49.5%
Confused 49.5%

AWS Rekognition

Age 33-49
Gender Male, 50.2%
Happy 49.5%
Fear 49.5%
Disgusted 49.5%
Angry 49.6%
Surprised 49.5%
Sad 50.1%
Confused 49.5%
Calm 49.8%
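
The age ranges, gender calls, and per-emotion scores above match the shape of the AWS Rekognition DetectFaces response when all facial attributes are requested. A minimal sketch, assuming the same placeholder image file as above:

    import boto3

    # Hypothetical example: request full facial attributes for each detected face.
    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")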

Feature analysis

Amazon

Person 99.4%
Wheel 92.9%
Car 81%

Categories