Human Generated Data

Title

Untitled (Cathedral high school ship parade float)

Date

c. 1935-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4381

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 98.9
Person 98.9
Person 98.3
Person 98.1
Person 96.9
Person 94.4
Person 94.2
Person 91.3
Person 86.8
Apparel 84.4
Clothing 84.4
Art 83.3
Drawing 83.3
Person 82.4
Transportation 77.2
Boat 77.2
Vehicle 77.2
Sketch 66.8
Shorts 66.7
People 65.1
Person 64.3
Person 63.9
Person 62.8
Amusement Park 57.2
Theme Park 57.2
Person 44.1

Clarifai
created on 2019-06-01

people 98.9
furniture 95.9
adult 95.5
group 95.5
monochrome 94.3
man 93.2
room 92.4
woman 92
chair 88.1
home 86.3
illustration 85.7
street 84.4
group together 84.2
many 83.8
indoors 83.7
vehicle 83.4
print 82.7
seat 79.4
art 78.5
wear 78.3

Imagga
created on 2019-06-01

sketch 100
drawing 79.4
representation 61.3
architecture 28.4
city 24.9
snow 22.2
winter 19.6
building 19.3
urban 18.3
ice 16.4
people 16.2
travel 15.5
business 15.2
art 14.8
house 14.3
history 13.4
scene 13
old 11.8
outdoors 11.2
cold 11.2
trees 10.7
crowd 10.6
reflection 10.6
group 10.5
walking 10.4
construction 10.3
exterior 10.1
design 10.1
negative 10.1
tree 10
silhouette 9.9
column 9.9
park 9.9
marble 9.8
frozen 9.5
ancient 9.5
famous 9.3
window 9.3
sculpture 9.2
interior 8.8
man 8.7
move 8.6
walk 8.6
wall 8.6
historical 8.5
stone 8.4
floor 8.4
landmark 8.1
office 8
water 8
white 7.9
station 7.8
snowy 7.8
sky 7.7
outdoor 7.6
hall 7.6
finance 7.6
weather 7.5
tourism 7.4
glass 7.4
speed 7.3
home 7.3
tourist 7.2
transportation 7.2
film 7.2
rural 7
seasonal 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

drawing 96.3
sketch 89.7
window 84.7
white 68.9
black and white 61.9
old 59.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 50.3%
Disgusted 50%
Confused 49.6%
Surprised 49.5%
Calm 49.7%
Happy 49.5%
Angry 49.6%
Sad 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Happy 49.6%
Disgusted 49.5%
Sad 49.6%
Surprised 49.6%
Angry 49.5%
Calm 49.9%
Confused 49.6%

AWS Rekognition

Age 19-36
Gender Female, 50.1%
Disgusted 49.5%
Sad 50.1%
Surprised 49.6%
Happy 49.5%
Angry 49.6%
Calm 49.7%
Confused 49.5%

AWS Rekognition

Age 27-44
Gender Female, 50.5%
Angry 49.6%
Happy 49.7%
Calm 49.8%
Surprised 49.6%
Sad 49.5%
Confused 49.6%
Disgusted 49.7%

AWS Rekognition

Age 45-65
Gender Male, 50.2%
Confused 49.5%
Surprised 49.5%
Calm 49.7%
Sad 49.8%
Happy 49.5%
Disgusted 49.8%
Angry 49.6%

AWS Rekognition

Age 19-36
Gender Female, 50.2%
Disgusted 49.6%
Sad 49.6%
Confused 49.7%
Happy 49.5%
Angry 49.7%
Surprised 49.5%
Calm 50%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Disgusted 49.5%
Calm 49.7%
Angry 49.6%
Sad 50%
Surprised 49.5%
Happy 49.6%
Confused 49.5%

Feature analysis

Amazon

Person 98.9%
Boat 77.2%

Categories

Imagga

paintings art 99.5%