Human Generated Data

Title

Time and Thought, New York City

Date

1976

People

Artist: Paul Diamond, American, 1942–2017

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.241

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.8
Person 99.8
Hair 98.7
Railing 89.8
Metropolis 84.5
Building 84.5
Urban 84.5
City 84.5
Town 84.5
Sunglasses 65.9
Accessories 65.9
Accessory 65.9
Vehicle 57.9
Transportation 57.9
Watercraft 56.8
Vessel 56.8
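
The label/confidence pairs above are typical output of Amazon Rekognition's label detection. A minimal sketch of how comparable tags could be generated, assuming configured AWS credentials and a hypothetical local file name:

```python
# Hedged sketch: Rekognition-style labels via boto3.
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # the tags above bottom out around 56.8
    )

for label in response["Labels"]:
    # Prints pairs such as "Person 99.8"
    print(f"{label['Name']} {label['Confidence']:.1f}")
```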

Clarifai
created on 2023-10-25

people 99.5
portrait 99
monochrome 98.5
street 97.1
one 96.5
adult 96.5
woman 95.7
music 94.7
art 91.9
man 91.4
two 91.3
girl 90.2
black and white 89.8
vintage 86.6
sea 86.1
reflection 85.9
beach 85.2
retro 85.1
wear 83.8
analogue 82.5

Imagga
created on 2022-01-09

billboard 45.8
signboard 37.1
structure 27
person 22.8
portrait 22.6
black 21.1
adult 20.7
people 20.6
hair 20.6
attractive 18.9
sexy 17.7
lifestyle 17.3
face 16.3
model 16.3
man 16.1
human 15.7
one 15.7
pretty 15.4
body 15.2
posing 15.1
dark 15
male 14.9
lighting 14.7
sensuality 14.5
fashion 14.3
silhouette 14.1
studio 12.9
park 12.8
urban 11.4
happy 11.3
binoculars 11.2
women 11.1
equipment 10.9
relaxation 10.9
gorgeous 10.9
light 10.7
lady 10.5
world 10.4
sitting 10.3
skin 10.2
cute 10
sensual 10
optical instrument 9.9
sky 9.8
apparatus 9.6
smile 9.3
slim 9.2
outdoor 9.2
device 9.2
alone 9.1
sun 8.7
water 8.7
television 8.6
expression 8.5
head 8.4
building 8.4
makeup 8.2
healthy 8.2
brunette 7.8
shower 7.8
instrument 7.5
city 7.5
clothing 7.5
outdoors 7.5
style 7.4
pose 7.2
smiling 7.2
dress 7.2
wet 7.2
modern 7
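
Imagga's tags come from its REST tagging endpoint. A rough sketch, assuming the v2 /tags response layout (result.tags[].tag.en plus confidence) and placeholder credentials:

```python
# Hedged sketch: Imagga v2 tagging endpoint over HTTP Basic auth.
# API_KEY, API_SECRET and IMAGE_URL are placeholders.
import requests

API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.com/photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

for item in resp.json()["result"]["tags"]:
    # Prints pairs such as "billboard 45.8"
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```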

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 96.3
water 94.1
text 91
black and white 86.7
sky 86.7
ship 82.8
clothing 77.2
woman 70
cloud 64.7
human face 50.6

Color Analysis

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 98%
Calm 42.4%
Sad 30.9%
Confused 9.6%
Surprised 7%
Fear 6.6%
Angry 1.6%
Disgusted 1.4%
Happy 0.5%
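
The age range, gender and emotion percentages above match the shape of Rekognition's face detection output. A minimal sketch, assuming configured AWS credentials and a hypothetical file name:

```python
# Hedged sketch: age range, gender and emotion scores from Rekognition face detection.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    for emotion in face["Emotions"]:
        # e.g. "CALM 42.4%"
        print(f"{emotion['Type']} {emotion['Confidence']:.1f}%")
```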

Microsoft Cognitive Services

Age 32
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
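
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision client, assuming application credentials are set and the v2+ Likelihood enum:

```python
# Hedged sketch: Google Cloud Vision face detection likelihoods.
# "photo.jpg" is a hypothetical file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Likelihoods come back as enum values such as VERY_UNLIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```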

Feature analysis

Amazon

Person 99.8%
Sunglasses 65.9%

Categories

Imagga

paintings art 97.9%

Captions

Microsoft
created on 2022-01-09

a woman sitting on a bed 71.4%
a woman sitting in a room 71.3%
a woman posing for a picture 71.2%
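
Ranked captions like these are produced by Azure Computer Vision's describe operation. A rough sketch against the v3.2 REST endpoint, with placeholder endpoint, key and file name; confidences are returned on a 0-1 scale and shown here as percentages:

```python
# Hedged sketch: Azure Computer Vision "describe" call that yields ranked captions.
# ENDPOINT, KEY and "photo.jpg" are placeholders; the v3.2 REST path is an assumption.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"

with open("photo.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

for caption in resp.json()["description"]["captions"]:
    # Confidence is on a 0-1 scale, e.g. 0.714 -> 71.4%
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```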

Text analysis

Amazon

TES
THE
NDS
PANCHROMATIC
VE
80
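
Fragments such as these are typical of Rekognition's text (OCR) detection on photographed signage. A minimal sketch, assuming configured AWS credentials and a hypothetical file name:

```python
# Hedged sketch: OCR fragments via Rekognition text detection.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        # Partial words such as "TES" or "NDS" are common on photographed text
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```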

Google

PANCHROMATIC
PANCHROMATIC