Human Generated Data

Title

Untitled (Buckeye Lake Amusement Park, near Columbus, Ohio)

Date

July 1938–August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.626

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 99.7
Human 99.7
Person 99.4
Person 99.4
Person 99.1
Person 98.7
Pedestrian 88.6
Person 86.7
Apparel 86.2
Clothing 86.2
Shorts 85.8
Person 74.6
Text 74.6
People 71.6
Footwear 65.9
Shoe 65.9
Crowd 63.1
Shoe 60.5
Symbol 59.2
Transportation 55.6
Vehicle 55.6

Clarifai
created on 2018-03-23

people 99.9
adult 98.6
group 97
group together 96.2
man 95.9
administration 95
street 94.2
woman 93.4
many 92.9
one 91.9
wear 89.3
monochrome 87.9
child 87.2
several 87.1
two 86.1
education 85.4
war 84.7
police 83.7
offense 83.1
leader 82.7

Imagga
created on 2018-03-23

book jacket 79
jacket 61.5
barbershop 54.4
shop 48
wrapping 46.7
mercantile establishment 34.3
covering 31.6
place of business 22.9
sign 21.8
old 21.6
door 20.1
blackboard 19.5
building 17.1
wall 17.1
texture 16
aged 13.6
vintage 13.2
antique 13
decoration 12.9
house 12.5
blank 12
window 11.9
frame 11.7
city 11.6
establishment 11.5
black 11.4
graffito 11
architecture 10.9
retro 10.7
entrance 10.6
ancient 10.4
empty 10.3
grunge 10.2
design 9.6
exterior 9.2
business 9.1
board 9
structure 9
chalkboard 8.8
urban 8.7
street 8.3
note 8.3
pattern 8.2
road 8.1
history 8.1
home 8
text 7.9
paper 7.8
chalk 7.8
travel 7.7
outdoor 7.6
decorative 7.5
message 7.3
border 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

outdoor 99.9
building 99.6
person 87.2
sign 73.9
people 56

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-38
Gender Female, 50.5%
Sad 46.1%
Surprised 45.2%
Confused 45.3%
Happy 45.1%
Calm 52.9%
Disgusted 45.1%
Angry 45.4%

AWS Rekognition

Age 16-27
Gender Male, 50%
Confused 45.2%
Sad 46.1%
Calm 51.6%
Disgusted 45.7%
Happy 45.1%
Surprised 45.6%
Angry 45.8%

AWS Rekognition

Age 27-44
Gender Female, 50.2%
Disgusted 49.7%
Surprised 49.6%
Sad 49.9%
Calm 49.6%
Angry 49.6%
Happy 49.6%
Confused 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Happy 49.5%
Confused 49.5%
Angry 49.6%
Disgusted 49.6%
Sad 49.8%
Calm 49.9%
Surprised 49.6%

Microsoft Cognitive Services

Age 23
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 65.9%

Text analysis

Amazon

HALL
RINK
HOTOS
PALMIST
3 HALL
3
CAMEL CONN
SAT
PALMST
TNTSSES
HOTOS nt
1os
NENIo
nt
2287

Google

ATIO AL MIST OTOS
AL
OTOS
ATIO
MIST