Human Generated Data

Title

Untitled (two photographs: crowd late in day; women wearing white on field)

Date

c. 1940, printed later

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6741

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.5
Human 99.5
Person 99.3
Person 99
Person 98.7
Person 98.4
Person 97.7
Person 97.4
Person 97.4
Person 96.3
Person 96
Person 95.6
Person 94.1
Person 92.8
Person 92.8
Person 91.2
Apparel 89.2
Clothing 89.2
Person 86.8
Person 84.5
Outdoors 71.8
LCD Screen 71
Display 71
Electronics 71
Monitor 71
Screen 71
Pedestrian 70.3
Person 67.8
Person 65.6
Person 65.1
Person 64.2
Nature 62.9
People 61.7
Poster 56.5
Advertisement 56.5
Window 55.3
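
The Amazon tags above have the shape of AWS Rekognition DetectLabels output: a label name paired with a confidence score in percent. A minimal sketch with boto3, assuming configured AWS credentials, a region, and a hypothetical S3 location for the scan (not the project's actual tagging pipeline):

import boto3

# Hypothetical client and image location; bucket/key are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "scans/4.2002.6741.jpg"}},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    # Each label carries a name and a confidence in percent,
    # e.g. "Person 99.5" as listed above.
    print(label["Name"], round(label["Confidence"], 1))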

Clarifai
created on 2019-11-16

people 99.8
many 98.9
group 98.5
monochrome 97.9
group together 97.8
street 97.6
man 96.7
adult 95.5
woman 91.7
crowd 91.6
city 91
vehicle 90.7
wear 89.6
winter 86.9
transportation system 85.2
snow 85.1
airport 82.5
two 80.5
war 78.2
several 78.1

Imagga
created on 2019-11-16

snow 22
window 21.5
landscape 19.3
silhouette 18.2
picket fence 17.5
fence 15.9
barrier 15.2
people 14.5
black 14.4
man 14.1
water 14
newspaper 13.9
lake 13.8
grunge 13.6
sky 13.4
sea 13.3
scene 12.1
male 12
old 11.8
windowsill 11.8
pattern 11.6
outdoor 11.5
travel 11.3
art 11.1
winter 11.1
ocean 11
ski slope 10.8
product 10.7
trees 10.7
weather 10.5
negative 10.4
cold 10.3
structure 10.2
tree 10
park 9.9
obstruction 9.9
outdoors 9.9
framework 9.8
river 9.8
design 9.6
slope 9.5
sill 9.4
building 9.4
creation 9.4
holiday 9.3
door 9.2
relaxation 9.2
wood 9.2
fisherman 9.1
texture 9
reflection 9
vacation 9
forest 8.7
season 8.6
boat 8.5
frame 8.4
film 8.4
relax 8.4
summer 8.4
ice 8.3
peaceful 8.2
calm 8.2
alone 8.2
horizon 8.1
coast 8.1
sun 8
light 8
screen 8
support 7.8
world 7.5
leisure 7.5
sport 7.4
supporting structure 7.4
rough 7.3
digital 7.3
paint 7.2
dirty 7.2
material 7.1
sliding door 7.1
structural member 7.1
architecture 7
geological formation 7
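
Imagga tags like these can be retrieved over its REST tagging API. A minimal sketch using requests, assuming the Imagga v2 /tags endpoint and its usual response shape; the credentials and image URL below are placeholders, not values from this record:

import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
image_url = "https://example.org/scans/4.2002.6741.jpg"  # hypothetical hosted scan

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Each entry pairs an English tag with a confidence score,
    # e.g. "snow 22" in the list above.
    print(item["tag"]["en"], round(item["confidence"], 1))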

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.5
person 89.2
black and white 89.1
man 72.5
white 63.2
water 60.4
clothing 58.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 50.5%
Confused 49.5%
Calm 49.9%
Surprised 49.5%
Fear 49.6%
Happy 49.5%
Angry 49.5%
Sad 49.9%
Disgusted 49.5%

AWS Rekognition

Age 46-64
Gender Male, 50.3%
Surprised 49.5%
Sad 49.7%
Confused 49.6%
Happy 49.5%
Disgusted 49.5%
Fear 50.1%
Angry 49.5%
Calm 49.5%

AWS Rekognition

Age 36-52
Gender Male, 50.4%
Confused 50.3%
Surprised 49.5%
Sad 49.5%
Calm 49.5%
Disgusted 49.5%
Happy 49.5%
Fear 49.5%
Angry 49.5%
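
The face estimates above (an age range, a gender call, and per-emotion confidences) match the shape of AWS Rekognition DetectFaces output when all facial attributes are requested. A minimal sketch with boto3, reusing the same hypothetical S3 location as in the label example:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "scans/4.2002.6741.jpg"}},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # {"Low": ..., "High": ...}
    gender = face["Gender"]   # {"Value": "Male"/"Female", "Confidence": ...}
    print(f"Age {age['Low']}-{age['High']}, "
          f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Each emotion (CALM, SAD, HAPPY, ...) carries its own confidence.
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")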

Feature analysis

Amazon

Person 99.5%