Human Generated Data

Title

Untitled (group of men singing on stadium field during large Catholic event)

Date

1955-1960

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11011

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 99.5
Person 99.5
Person 99.3
Person 97.4
Crowd 89
Audience 86.1
Person 83.6
Field 83.1
People 82.9
Person 82.4
Clothing 81.5
Apparel 81.5
Person 75.4
Building 70.8
Cricket 57.1
Sports 57.1
Sport 57.1
Person 56.6

Clarifai
created on 2019-03-25

people 99.7
adult 98.3
many 97.8
group together 97.5
cemetery 97.2
man 96.6
group 96.4
war 94.8
crowd 92.9
funeral 92.4
monochrome 92.2
grave 92.1
woman 90.3
military 86.3
street 85.8
leader 85.7
soldier 85.5
tombstone 84
memory 83.6
one 82.8

Imagga
created on 2019-03-25

cemetery 49
apiary 45.9
shed 37.6
building 27.8
outbuilding 27.5
landscape 22.3
tree 19.6
rural 19.4
ashcan 19.3
house 18.4
container 17.6
structure 17.5
grass 17.4
gravestone 15.7
fence 15.5
bin 15.4
old 15.3
memorial 14.1
architecture 14.1
park 14
stone 13.8
park bench 13.7
city 13.3
bench 12.8
sky 12.7
forest 12.2
trees 11.6
picket fence 10.8
travel 10.6
country 10.5
garden 10.2
countryside 10
outdoor 9.9
summer 9.6
village 9.6
home 9.6
town 9.3
field 9.2
seat 9
autumn 8.8
outdoors 8.3
scenery 8.1
scenic 7.9
agriculture 7.9
urban 7.9
spring 7.8
clouds 7.6
barrier 7.6
wood 7.5
street 7.4
water 7.3
new 7.3
sun 7.2
road 7.2
farm 7.1
mailbox 7.1

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

grass 99.5
outdoor 98.5
black 67.2
person 67.2
cemetery 37.9
black and white 32.3
assembly 18.1
wedding 17.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 50.1%
Sad 49.8%
Surprised 49.6%
Calm 49.7%
Confused 49.5%
Happy 49.6%
Disgusted 49.7%
Angry 49.6%

AWS Rekognition

Age 16-27
Gender Female, 50%
Happy 49.6%
Surprised 49.5%
Disgusted 49.5%
Calm 49.6%
Sad 50.2%
Confused 49.5%
Angry 49.5%

AWS Rekognition

Age 20-38
Gender Male, 50.1%
Confused 49.5%
Sad 49.5%
Calm 50.5%
Happy 49.5%
Surprised 49.5%
Angry 49.5%
Disgusted 49.5%

AWS Rekognition

Age 17-27
Gender Male, 50.2%
Angry 49.5%
Surprised 49.5%
Sad 49.9%
Calm 50%
Confused 49.5%
Happy 49.5%
Disgusted 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Happy 49.6%
Sad 49.5%
Calm 49.5%
Angry 49.5%
Confused 49.5%
Disgusted 50.3%
Surprised 49.5%

Feature analysis

Amazon

Person 99.5%