Human Generated Data

Title

Untitled (people getting off boat, dressed for luau)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16752

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Clothing 99.5
Apparel 99.5
Person 99.5
Person 99.4
Shorts 99.4
Person 98.1
Female 90.2
Person 85.1
Outdoors 84.2
Face 84.1
Nature 83.3
Sea 82.7
Ocean 82.7
Water 82.7
People 79.7
Dress 78.8
Shoreline 78.6
Beach 77.4
Coast 77.4
Woman 75.7
Meal 74.2
Food 74.2
Person 70.8
Portrait 69.4
Photography 69.4
Photo 69.4
Girl 66.3
Shoe 66
Footwear 66
Kid 64.8
Child 64.8
Man 58.7
Skirt 57.5
Crowd 55.1

Clarifai
created on 2023-10-29

people 99.9
group together 98.3
adult 98.2
group 97.4
woman 96.3
man 95.6
child 95.4
wear 90.2
recreation 88.5
administration 87.8
several 87
war 86.9
four 83.3
five 82.3
many 81.4
outfit 81.4
leader 81
vehicle 80.7
military 79.8
three 79.8

Imagga
created on 2022-02-26

man 29.5
gun 25.2
person 25
people 20.6
danger 20
engineer 19.5
mask 18.5
male 18.4
protection 18.2
device 18.2
destruction 16.6
adult 15
smoke 14.9
soldier 14.7
military 14.5
industrial 13.6
nuclear 13.6
weapon 13.5
gas 13.5
outdoor 13
disaster 12.7
protective 12.7
men 12
radioactive 11.8
radiation 11.7
toxic 11.7
cannon 11.6
environment 11.5
rifle 11.1
safety 11
stalker 10.9
dirty 10.8
accident 10.7
brass 10.7
chemical 10.6
old 10.4
musical instrument 10.3
world 10.1
suit 9.9
travel 9.8
camouflage 9.8
outdoors 9.8
clothing 9.8
war 9.7
protect 9.6
wind instrument 9.6
sky 9.6
work 9.5
equipment 9.3
child 9.2
holding 9.1
steam 8.7
explosion 8.7
industry 8.5
uniform 8.5
sport 8.4
human 8.2
horn 8
grass 7.9
cancer 7.8
army 7.8
portrait 7.8
field 7.5
fire 7.5
respirator 7.5
silhouette 7.4
religion 7.2
life 7.1
to 7.1
summer 7.1

Microsoft
created on 2022-02-26

clothing 95.6
text 94.8
person 88.6
black and white 84.4
man 61
old 47.5

Face analysis
AWS Rekognition

Age 51-59
Gender Male, 96.8%
Sad 35%
Happy 19.6%
Disgusted 18.6%
Confused 14.3%
Surprised 4.4%
Angry 4%
Calm 2%
Fear 2%

AWS Rekognition

Age 43-51
Gender Female, 99.6%
Sad 83.1%
Calm 15.3%
Happy 0.5%
Fear 0.4%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.7%
Person 99.5%
Person 99.4%
Person 98.1%
Person 85.1%
Person 70.8%
Shoe 66%

Text analysis

Amazon

46
.
Craft
. Craft
Dinox

Google

46
46