Human Generated Data

Title

Untitled (people getting off boat, dressed for luau)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16746

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.6
Human 99.6
Clothing 99.1
Apparel 99.1
Person 99
Person 97.5
Water 97.1
Person 97
Dress 96.6
Sea 95.8
Ocean 95.8
Outdoors 95.8
Nature 95.8
Person 95.2
Shoreline 94.2
Female 90.9
Person 89.1
Beach 86.7
Coast 86.7
Shorts 84.6
Face 76.3
People 76
Woman 75.3
Land 74.7
Waterfront 71.8
Girl 69.1
Pier 68.9
Dock 68.9
Port 68.9
Kid 66.7
Child 66.7
Portrait 65.9
Photography 65.9
Photo 65.9
Person 62.6
Crowd 62.6
Plant 61.3
Leisure Activities 60.3
Swimwear 58.7
Tree 58.2
Sand 55.3
Soldier 55.2
Military 55.2
Military Uniform 55.2
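
The Amazon tags above are label/confidence pairs from AWS Rekognition. A minimal sketch of how comparable output could be produced with the DetectLabels API via boto3 follows; the file name, region, and confidence threshold are illustrative assumptions, not details of the museum's actual pipeline.

    import boto3

    # Illustrative only: region, file name, and threshold are assumptions.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the tag list above bottoms out around 55
        )

    # Print "Label confidence" pairs in the same style as the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')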

Clarifai
created on 2023-10-29

people 99.9
group 99.2
adult 98.8
group together 98.1
man 97.1
many 96.4
woman 94.2
watercraft 94.2
child 92.8
print 92.7
military 91.9
vehicle 91.7
transportation system 89.3
war 89
recreation 88.6
several 87.1
soldier 86.4
monochrome 86.1
wear 85
veil 84

Imagga
created on 2022-02-26

barrow 25.1
man 22.2
handcart 20.5
people 19.5
shovel 18.7
landscape 18.6
sky 18.5
wheeled vehicle 17.4
beach 16.6
outdoor 16.1
outdoors 16
male 15.6
travel 15.5
sunset 14.4
silhouette 14.1
water 14
vehicle 13
danger 12.7
destruction 12.7
ocean 12.4
adult 12.4
tree 12.3
person 11.9
winter 11.9
protection 11.8
hand tool 11.7
sport 11.7
river 11.6
tool 11.3
sun 11.3
building 11.2
summer 10.9
mountain 10.8
disaster 10.7
snow 10.5
old 10.4
conveyance 10.2
sea 10.2
gun 10.1
leisure 10
clothing 9.8
nuclear 9.7
sand 9.4
smoke 9.3
park 9.3
protective 8.8
day 8.6
cold 8.6
men 8.6
mask 8.2
industrial 8.2
dirty 8.1
camouflage 7.9
soldier 7.8
toxic 7.8
season 7.8
portrait 7.8
dusk 7.6
walking 7.6
field 7.5
dark 7.5
rifle 7.4
holding 7.4
tourism 7.4
environment 7.4
safety 7.4
vacation 7.4
lifestyle 7.2
black 7.2
coast 7.2
women 7.1
stone 7.1
architecture 7

Microsoft
created on 2022-02-26

text 97.8
black and white 92.1
drawing 77.4
monochrome 63.9
sketch 62.8

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 56.4%
Calm 99.4%
Sad 0.4%
Happy 0.1%
Surprised 0%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 7-17
Gender Male, 61.8%
Happy 38.4%
Calm 34.1%
Sad 12.9%
Disgusted 10.7%
Fear 1.5%
Angry 1.1%
Confused 0.9%
Surprised 0.5%
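
The two age/gender/emotion breakdowns above are typical of an AWS Rekognition DetectFaces response. A minimal sketch follows, assuming a local image file and default credentials; it is not the museum's actual pipeline.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back unsorted; sort by confidence, highest first.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')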

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
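
The five likelihood blocks above correspond to per-face results from Google Cloud Vision face detection. A minimal sketch follows, assuming the google-cloud-vision client library and configured credentials; the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    # Each detected face carries likelihood ratings (VERY_UNLIKELY ... VERY_LIKELY).
    for face in client.face_detection(image=image).face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)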

Feature analysis

Amazon

Person
Person 99.6%
Person 99%
Person 97.5%
Person 97%
Person 95.2%
Person 89.1%
Person 62.6%

Text analysis

Amazon

Craft
33
ALV
Chris Craft
Chris

Google

Caris Caft ALV 33
Caris
Caft
ALV
33
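
Both text readings above ("Chris Craft" / "ALV 33" and the "Caris Caft" variant) are OCR output. A minimal sketch of reading the same text back with Amazon Rekognition DetectText and Google Cloud Vision text detection follows; client setup and the file name are assumptions, not the museum's pipeline.

    import boto3
    from google.cloud import vision

    with open("photo.jpg", "rb") as f:  # placeholder file name
        content = f.read()

    # Amazon: LINE and WORD detections, each with a confidence score.
    rekognition = boto3.client("rekognition")
    for det in rekognition.detect_text(Image={"Bytes": content})["TextDetections"]:
        print(det["Type"], det["DetectedText"], round(det["Confidence"], 1))

    # Google: the first annotation is the full text block, the rest are words.
    client = vision.ImageAnnotatorClient()
    response = client.text_detection(image=vision.Image(content=content))
    for annotation in response.text_annotations:
        print(annotation.description)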