Human Generated Data

Title

Untitled (doll and suitcase)

Date

1968

People

Artist: Barbara Norfleet, American, born 1926

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1962

Copyright

© Barbara Norfleet

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Furniture 97.7
Person 96.7
Human 96.7
Table 90.8
Desk 86
LCD Screen 71.8
Electronics 71.8
Screen 71.8
Monitor 71.8
Display 71.8
Wood 61
Liquor 59.9
Alcohol 59.9
Beverage 59.9
Drink 59.9
Bottle 59.4
Pc 57
Computer 57

Clarifai
created on 2023-10-25

people 99.7
monochrome 99.2
street 96.4
analogue 95.7
still life 93.9
portrait 92.6
man 91.9
airplane 91.4
war 91.1
furniture 90.6
two 89.5
vintage 89.1
one 88.8
music 88.5
guitar 87.8
studio 86.7
dog 86.2
portfolio 86.1
bottle 85.1
analog 84.8

Imagga
created on 2022-01-08

musical instrument 25.1
adult 22
man 21.5
table 20.8
people 20.1
person 19.7
male 16.3
lifestyle 15.9
working 15
tool 14.6
device 14.3
interior 14.1
work 14.1
equipment 14.1
indoors 14.1
home 12.8
spatula 12.7
room 12.6
glass 12.4
black 12
attractive 11.9
chair 11.6
accordion 11.3
happy 11.3
wind instrument 11.2
microphone 11.1
face 10.6
engine 10.6
pretty 10.5
portrait 10.3
sitting 10.3
professional 10.2
smiling 10.1
house 10
smile 10
hand 9.9
handsome 9.8
job 9.7
keyboard instrument 9.3
modern 9.1
office 9
drum 8.9
looking 8.8
women 8.7
stringed instrument 8.7
desk 8.6
turner 8.6
home appliance 8.6
repair 8.6
cute 8.6
party 8.6
men 8.6
appliance 8.3
wine 8.3
music 8.2
cup 8.1
kitchen 8
garage 7.9
cooking utensil 7.8
break 7.6
instrument 7.6
drink 7.5
vehicle 7.5
car 7.5
camera 7.5
one 7.5
computer 7.4
glasses 7.4
entertainment 7.4
lady 7.3
business 7.3
seat 7.1
happiness 7

Microsoft
created on 2022-01-08

indoor 98.3
wall 96.8
text 94.6
black and white 90.8
bottle 72.4
white 62.2
cluttered 36.2
several 10.5

Face analysis

AWS Rekognition

Age 0-3
Gender Female, 100%
Calm 99.9%
Sad 0%
Surprised 0%
Disgusted 0%
Angry 0%
Happy 0%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.7%

Captions

Microsoft
created on 2022-01-08

a cluttered desk 80.3%
a close up of a cluttered desk 78.8%
a cluttered desk with a laptop 67.8%

Text analysis

Amazon

CLEAR
MAN
TOMI
UNGERER
TUR
ST-OLEUM
Aust
1 MAN
WE
-
I
1
CDN
STRANG
HPW

Google

Aus ST-OLEUM CLEAR TUR TRA MAN TOMI UNGERER
Aus
ST-OLEUM
CLEAR
TUR
TRA
MAN
TOMI
UNGERER