Human Generated Data

Title

Untitled (clergymen parading down residential street)

Date

c. 1960

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11035

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Person 99.8
Human 99.8
Person 99.4
Person 99.3
Person 98.2
Person 97.9
Person 97.3
Automobile 87.2
Car 87.2
Transportation 87.2
Vehicle 87.2
Person 83.7
Pedestrian 80.8
Crowd 77.4
Funeral 76.4
Person 76
Military 75.1
Military Uniform 73.5
Officer 72
Person 71.8
People 68.2
Person 63.8
Marching 59.8
Tarmac 59.1
Asphalt 59.1
Clothing 55.3
Apparel 55.3
Person 46.7
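
The repeated confidences above (Car, Vehicle, Transportation, and Automobile all at 87.2) reflect how AWS Rekognition's DetectLabels response nests parent categories under each label, sharing the child's score. A minimal sketch of flattening such a response into the list shown above, using a hypothetical sample response that follows the documented DetectLabels shape:

```python
# Hedged sketch: flattening a DetectLabels-style response into flat
# "name confidence" rows. The sample dict is hypothetical, not taken
# from this record's actual API response.

def flatten_labels(response, min_confidence=0.0):
    """Expand each label and its parent categories into (name, confidence) pairs."""
    rows = []
    for label in response["Labels"]:
        conf = round(label["Confidence"], 1)
        rows.append((label["Name"], conf))
        # Parent categories are listed under "Parents" and inherit the
        # child label's confidence, which is why Car, Vehicle, and
        # Transportation all appear at 87.2 in the tag list above.
        for parent in label.get("Parents", []):
            rows.append((parent["Name"], conf))
    return [r for r in rows if r[1] >= min_confidence]

sample = {
    "Labels": [
        {"Name": "Car", "Confidence": 87.2,
         "Parents": [{"Name": "Vehicle"}, {"Name": "Transportation"}]},
        {"Name": "Person", "Confidence": 99.8, "Parents": []},
    ]
}

for name, conf in flatten_labels(sample):
    print(name, conf)
```

Deduplicating or thresholding these rows (the `min_confidence` parameter) is one way a display layer could trim low-certainty tags such as "Person 46.7".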

Clarifai
created on 2019-03-25

people 99.7
group together 99
street 98.9
adult 96.4
group 96
monochrome 95.9
man 95.4
many 94.4
road 93.2
woman 92.4
administration 90.9
war 90.1
military 87.2
transportation system 87
vehicle 84.8
crowd 84.2
child 83.5
soldier 81.9
leader 80.4
police 78.3

Imagga
created on 2019-03-25

swing 46.2
mechanical device 36.6
plaything 34.2
mechanism 27.2
park bench 20.7
park 18.4
bench 18.3
trees 16.9
light 16
black 15.6
seat 14.6
city 14.1
travel 14.1
outdoor 13.8
sun 12.9
summer 12.9
tree 12.6
silhouette 12.4
forest 12.2
landscape 11.9
conveyance 11.6
sky 11.5
sport 11
dark 10.9
urban 10.5
old 10.4
furniture 10.1
sunset 9.9
transportation 9.9
wheeled vehicle 9.7
scene 9.5
architecture 9.4
evening 9.3
outdoors 9.1
track 9.1
night 8.9
autumn 8.8
man 8.7
ocean 8.3
vacation 8.2
road 8.1
scenery 8.1
recreation 8.1
scenic 7.9
day 7.8
people 7.8
sunny 7.7
building 7.7
winter 7.7
beach 7.6
relax 7.6
fun 7.5
leisure 7.5
transport 7.3
morning 7.2
barrier 7.2
river 7.1
structure 7.1
sea 7

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

outdoor 99.3
road 97.1
white 78.7
black 78.2
way 52.6
black and white 52.6
street 48.1
man 7.9
monochrome 7.5
people 6.4

Face analysis

Amazon

AWS Rekognition

Age 15-25
Gender Female, 50.5%
Happy 49.7%
Calm 49.5%
Sad 50%
Angry 49.5%
Surprised 49.5%
Disgusted 49.5%
Confused 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Surprised 49.6%
Angry 49.6%
Disgusted 49.7%
Happy 49.6%
Sad 49.6%
Confused 49.6%
Calm 49.8%

AWS Rekognition

Age 26-43
Gender Male, 50.1%
Calm 49.5%
Disgusted 50.4%
Sad 49.5%
Confused 49.5%
Happy 49.5%
Surprised 49.5%
Angry 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Happy 49.5%
Disgusted 49.6%
Calm 49.6%
Sad 50.2%
Surprised 49.5%
Confused 49.5%
Angry 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Sad 49.6%
Angry 49.7%
Calm 49.6%
Surprised 49.7%
Happy 49.6%
Confused 49.6%
Disgusted 49.7%

AWS Rekognition

Age 35-52
Gender Male, 50%
Happy 49.6%
Disgusted 49.6%
Sad 49.8%
Angry 49.6%
Confused 49.8%
Surprised 49.6%
Calm 49.7%

AWS Rekognition

Age 23-38
Gender Female, 50.2%
Confused 49.6%
Calm 49.8%
Happy 49.6%
Disgusted 49.6%
Surprised 49.6%
Sad 49.8%
Angry 49.6%

AWS Rekognition

Age 2-5
Gender Female, 50.3%
Sad 50.2%
Happy 49.6%
Angry 49.5%
Disgusted 49.5%
Confused 49.5%
Calm 49.6%
Surprised 49.5%

AWS Rekognition

Age 57-77
Gender Female, 50.1%
Surprised 49.5%
Angry 49.5%
Disgusted 50.4%
Calm 49.5%
Sad 49.5%
Confused 49.5%
Happy 49.5%
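
Note that the emotion scores in every face record above cluster tightly around 50%, so the leading emotion wins by a very small margin. A minimal sketch of extracting the dominant emotion from one face record of a DetectFaces-style response (the sample face below is hypothetical but follows the documented response shape):

```python
# Hedged sketch: picking the top emotion from a DetectFaces-style face
# record. The sample_face dict is hypothetical, not this record's
# actual API response.

def dominant_emotion(face):
    """Return (emotion, confidence) for the highest-scoring emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

sample_face = {
    "AgeRange": {"Low": 2, "High": 5},
    "Gender": {"Value": "Female", "Confidence": 50.3},
    "Emotions": [
        {"Type": "SAD", "Confidence": 50.2},
        {"Type": "HAPPY", "Confidence": 49.6},
        {"Type": "CALM", "Confidence": 49.6},
    ],
}

print(dominant_emotion(sample_face))  # SAD leads by only 0.6 points
```

With margins this narrow, the dominant emotion is effectively a coin flip, which is worth keeping in mind when reading the per-face listings above.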

Feature analysis

Amazon

Person 99.8%
Car 87.2%

Captions

Microsoft

a black and white photo of a street 93.8%
a black and white photo of people on a street 93.5%
a black and white photo of a person 89.4%

Text analysis

Amazon

XAGOX
MJ13--Y137