Human Generated Data

Title

Untitled (elevated view of clergymen entering church on town street)

Date

c. 1955-1960

People

Artist: Claseman Studio, American, 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11070

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 98.4
Person 98.4
Person 97.6
Person 96.4
Person 95.5
Person 95.1
Person 94.6
Person 94.4
Person 94
Funeral 92
Person 91.2
Person 85.4
Person 82.3
Person 81.3
Military 79.7
Military Uniform 79.7
People 79
Transportation 76.3
Vehicle 76.3
Car 76.3
Automobile 76.3
Person 75
Crowd 72.5
Person 70
Car 69
Person 69
Person 67.7
Person 66.3
Armored 60.5
Army 60.5
Person 59.4
Furniture 58.8
Clothing 58.1
Apparel 58.1
Person 55.2
Person 46.6
Person 41.9
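
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. The sketch below shows how such a list could be produced with boto3; the image filename and the MinConfidence threshold are assumptions, not part of this record.

```python
# Hypothetical sketch: reproducing label/confidence pairs like the ones above
# with AWS Rekognition's DetectLabels. The local file path and the confidence
# threshold are assumptions, not taken from the catalog record.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("claseman_studio_untitled.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=40,  # the list above includes labels down to roughly 42%
)

for label in response["Labels"]:
    # Prints e.g. "Person 98.4", matching the format of the list above
    print(f"{label['Name']} {label['Confidence']:.1f}")
```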

Clarifai
created on 2019-03-25

people 99.9
many 99.5
group together 99.2
group 99
adult 98.4
vehicle 97.8
man 95.9
military 95.2
war 94.8
transportation system 94.4
administration 93.9
crowd 90.8
home 90.3
wear 90
leader 89.7
soldier 86.6
street 85.1
watercraft 83.7
several 83.2
woman 81.2
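
The Clarifai tags above follow the concept/confidence format of Clarifai's v2 prediction API. A minimal sketch, assuming a REST call against a general-purpose model; the API key, model ID, and image URL below are placeholders, not values from this record.

```python
# Hypothetical sketch: concept/confidence pairs like the Clarifai list above
# resemble output from Clarifai's v2 "outputs" endpoint. All credentials,
# identifiers, and URLs here are placeholders/assumptions.
import requests

CLARIFAI_API_KEY = "your_api_key"     # placeholder credential
GENERAL_MODEL_ID = "general-model-id" # placeholder model identifier
image_url = "https://example.org/claseman_studio_untitled.jpg"  # hypothetical URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={
        "Authorization": f"Key {CLARIFAI_API_KEY}",
        "Content-Type": "application/json",
    },
    json={"inputs": [{"data": {"image": {"url": image_url}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are returned on a 0-1 scale; scale to match the list above
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```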

Imagga
created on 2019-03-25

travel 28.9
city 26.6
tourism 24.8
landscape 24.5
stall 24.1
water 22.7
sky 21
dairy 20.7
architecture 19.5
sea 18.8
building 16.1
pen 15.5
river 15.1
vacation 14.7
coast 14.4
park 14
town 13.9
structure 13.6
mountain 13.3
plow 13.3
cityscape 13.2
scenic 13.2
urban 13.1
rock 13
enclosure 13
summer 12.9
stone 12.1
shore 12.1
tourist 12
rural 11.5
famous 11.2
old 11.1
lake 11
tree 10.8
outdoor 10.7
houses 10.7
panorama 10.5
buildings 10.4
scene 10.4
beach 10.2
house 10
ocean 10
tool 9.9
history 9.8
aerial 9.7
clouds 9.3
boat 9.3
church 9.2
industrial 9.1
scenery 9
country 8.8
panoramic 8.6
coastline 8.5
hill 8.4
mountains 8.3
landmark 8.1
farm 8
agriculture 7.9
spring 7.8
day 7.8
sunny 7.7
outside 7.7
construction 7.7
england 7.6
capital 7.6
bridge 7.6
outdoors 7.5
destination 7.5
group 7.3
seaside 7.2
horizon 7.2
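
The Imagga tags above match the tag/confidence format of Imagga's auto-tagging endpoint. A minimal sketch, assuming v2 of the REST API; the credentials and image URL are placeholders, not values from this record.

```python
# Hypothetical sketch: tag/confidence pairs like the Imagga list above can be
# requested from Imagga's tagging endpoint. The credentials and image URL are
# placeholders/assumptions.
import requests

IMAGGA_API_KEY = "your_api_key"        # placeholder credential
IMAGGA_API_SECRET = "your_api_secret"  # placeholder credential
image_url = "https://example.org/claseman_studio_untitled.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # Prints e.g. "travel 28.9", matching the format of the list above
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```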

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

outdoor 98.4
people 84.6
white 61.3
crowd 30.5
black and white 30.5
street 5
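
The Microsoft tags above resemble output from the Azure Computer Vision image-analysis operation. A minimal sketch using the REST API; the endpoint, API version, and subscription key are placeholders, and the 2019 tags in this record were likely produced by an earlier service version.

```python
# Hypothetical sketch: tag lists like the Microsoft block above resemble output
# from Azure Computer Vision's "analyze" operation. Endpoint, API version, and
# key are placeholders/assumptions.
import requests

AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "your_subscription_key"                                     # placeholder

with open("claseman_studio_untitled.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": AZURE_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Azure returns confidences on a 0-1 scale; scale to match the list above
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```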

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 57-77
Gender Female, 51.4%
Sad 47%
Surprised 45.5%
Calm 50.5%
Confused 45.4%
Happy 45.8%
Disgusted 45.4%
Angry 45.4%

AWS Rekognition

Age 12-22
Gender Female, 50.4%
Happy 49.5%
Sad 49.6%
Angry 49.6%
Disgusted 49.9%
Surprised 49.5%
Calm 49.8%
Confused 49.5%

AWS Rekognition

Age 45-65
Gender Female, 50.4%
Happy 49.7%
Calm 49.6%
Sad 50.1%
Angry 49.6%
Confused 49.5%
Surprised 49.5%
Disgusted 49.6%

AWS Rekognition

Age 48-68
Gender Female, 50.3%
Disgusted 49.6%
Surprised 49.5%
Calm 49.8%
Happy 49.6%
Sad 49.9%
Confused 49.6%
Angry 49.6%

AWS Rekognition

Age 14-25
Gender Female, 50.2%
Calm 49.7%
Disgusted 49.5%
Sad 50%
Confused 49.6%
Happy 49.6%
Surprised 49.5%
Angry 49.6%

AWS Rekognition

Age 16-27
Gender Female, 50.5%
Angry 49.5%
Happy 49.7%
Sad 49.6%
Confused 49.7%
Disgusted 49.5%
Calm 49.8%
Surprised 49.6%

AWS Rekognition

Age 26-43
Gender Male, 50.3%
Angry 49.7%
Surprised 49.6%
Sad 49.8%
Calm 49.6%
Confused 49.6%
Happy 49.6%
Disgusted 49.7%

AWS Rekognition

Age 4-9
Gender Male, 50.4%
Calm 49.9%
Sad 49.6%
Confused 49.5%
Disgusted 49.7%
Angry 49.6%
Happy 49.6%
Surprised 49.6%

AWS Rekognition

Age 48-68
Gender Male, 50.3%
Confused 49.6%
Angry 49.6%
Surprised 49.5%
Happy 49.5%
Calm 49.9%
Disgusted 49.5%
Sad 49.9%

AWS Rekognition

Age 23-38
Gender Male, 50.4%
Surprised 49.5%
Angry 49.7%
Sad 50%
Calm 49.6%
Happy 49.5%
Confused 49.6%
Disgusted 49.5%

AWS Rekognition

Age 12-22
Gender Female, 50.1%
Sad 50.1%
Disgusted 49.5%
Happy 49.5%
Calm 49.6%
Angry 49.6%
Confused 49.6%
Surprised 49.6%

AWS Rekognition

Age 27-44
Gender Female, 50.7%
Confused 45.1%
Happy 45.1%
Sad 54.1%
Surprised 45.1%
Angry 45.2%
Disgusted 45.1%
Calm 45.2%

AWS Rekognition

Age 35-52
Gender Male, 50.2%
Angry 49.5%
Calm 49.5%
Happy 49.5%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Sad 50.4%

AWS Rekognition

Age 23-38
Gender Male, 50.2%
Surprised 49.5%
Disgusted 49.5%
Calm 50%
Sad 49.9%
Confused 49.5%
Happy 49.5%
Angry 49.6%

AWS Rekognition

Age 26-43
Gender Male, 50.5%
Angry 49.5%
Disgusted 49.5%
Surprised 49.5%
Sad 50.3%
Confused 49.6%
Calm 49.6%
Happy 49.5%

AWS Rekognition

Age 12-22
Gender Male, 50%
Calm 50%
Disgusted 49.5%
Sad 49.8%
Confused 49.5%
Happy 49.5%
Surprised 49.5%
Angry 49.5%

AWS Rekognition

Age 17-27
Gender Male, 50.2%
Sad 49.9%
Disgusted 49.5%
Happy 49.5%
Calm 49.9%
Angry 49.6%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 48-68
Gender Female, 50.2%
Confused 49.5%
Surprised 49.5%
Sad 50.3%
Disgusted 49.5%
Happy 49.7%
Calm 49.5%
Angry 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Confused 49.5%
Angry 49.6%
Surprised 49.5%
Disgusted 49.5%
Calm 49.6%
Happy 49.6%
Sad 50.1%

AWS Rekognition

Age 35-52
Gender Male, 50.4%
Happy 49.5%
Sad 49.6%
Calm 50.1%
Angry 49.6%
Confused 49.6%
Disgusted 49.6%
Surprised 49.5%
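
Each block above (an age range, a gender estimate, and seven emotion scores) matches the per-face attributes returned by AWS Rekognition's DetectFaces operation when all attributes are requested. A minimal sketch with boto3; the image filename is an assumption.

```python
# Hypothetical sketch: the age-range / gender / emotion blocks above are the
# kind of per-face output AWS Rekognition's DetectFaces returns when all
# attributes are requested. The local file path is an assumption.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("claseman_studio_untitled.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types come back uppercase (e.g. "HAPPY"); match the blocks above
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```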

Feature analysis

Amazon

Person 98.4%
Car 76.3%

Text analysis

Amazon

s
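
A single detected string such as "s" is the kind of result AWS Rekognition's DetectText returns for a scene with little legible signage. A minimal sketch with boto3; the image filename is an assumption.

```python
# Hypothetical sketch: reading back text detections like the single "s" above
# with AWS Rekognition's DetectText. The local file path is an assumption.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("claseman_studio_untitled.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # Rekognition returns both LINE and WORD results
        print(detection["DetectedText"])
```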