Human Generated Data

Title

Untitled (people serving food at Luau)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16437

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Apparel 100
Clothing 100
Human 98.9
Person 98.9
Person 98.9
Animal 98.3
Bird 98.3
Person 97
Hat 96.6
Sun Hat 94
Person 93.5
Bird 74.8
Horse 72.7
Mammal 72.7
Furniture 68.1
Hat 65.4
Outdoors 61.7
Bonnet 59.9
Cowboy Hat 59
Person 58.2
Coat 58.2
Art 57.7

Imagga
created on 2022-02-11

building 18.9
hall 17.6
man 15.4
structure 14.4
house 14.2
tourism 14
product 13.6
chair 13.5
people 13.4
travel 13.4
person 12.8
table 12.7
architecture 12.6
newspaper 12.4
vacation 12.3
resort 12.1
work 12
outdoors 11.9
city 11.6
sky 11.5
hotel 11.4
male 11.3
holiday 10.7
patio 10.7
couple 10.4
summer 10.3
relaxation 10
outdoor 9.9
scene 9.5
creation 9.5
office 9.2
business 9.1
home 8.8
tree 8.7
wicker 8.4
beach 8.4
modern 8.4
evening 8.4
old 8.4
sea 8.2
laptop 8.2
landscape 8.2
tourist 8.1
room 8.1
musical instrument 8.1
sun 8
computer 8
job 8
interior 8
roof 7.6
relax 7.6
area 7.5
traditional 7.5
animal 7.4
center 7.3
lifestyle 7.2
love 7.1

Google
created on 2022-02-11

Hat 93.3
Black-and-white 85.2
Style 84.1
Art 82.5
Sun hat 81.1
Adaptation 79.3
Monochrome photography 76.6
Monochrome 76
Event 72.2
Room 71.8
Vintage clothing 70.5
Illustration 69.9
Visual arts 69.9
Painting 69.2
Font 66.9
Chair 65.1
History 63.4
Music 61.1
Table 59.5
Pattern 57.7

Microsoft
created on 2022-02-11

text 97.6
drawing 92.1
person 89
sketch 76
black and white 71.8
old 70.3
cartoon 69.8
hat 65.9
clothing 65.8

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 79.7%
Calm 68.1%
Happy 22.1%
Confused 2.5%
Sad 2.3%
Disgusted 1.8%
Surprised 1.7%
Angry 1.2%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Bird 98.3%
Hat 96.6%
Horse 72.7%

Captions

Microsoft

a group of people sitting in front of a window 76.6%
a group of people standing in front of a window 76.5%
a group of people sitting and standing in front of a window 76.4%

Text analysis

Amazon

GALLONS
JUGS
FOUR

Google

MJI7--
YT37A°2
XAGOX
MJI7-- YT37A°2 - - XAGOX
-