Human Generated Data

Title

Leaving Arkansas

Date

1985

People

Artist: Ken Miller, American, born 1958 (?)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P1987.7

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.3
Human 99.3
Plant 99.2
Fruit 92
Food 92
Sphere 72.7
Produce 61.7
Apple 58.3
Finger 55.5

Clarifai
created on 2023-10-25

people 99.8
monochrome 99.5
portrait 98.8
child 98.6
street 97.8
one 96.1
man 94.8
adult 94.2
girl 93.5
market 93.4
woman 93.4
boy 92.1
son 91.9
food 86
baby 83.1
black and white 82.8
two 82.5
war 80.5
group 78.6
art 78.1

Imagga
created on 2022-01-09

pumpkin 78.8
vegetable 52.3
fruit 52.2
abacus 50.8
food 50.7
produce 46.8
calculator 41.1
autumn 33.4
fall 27.2
harvest 26.3
fresh 24.2
seasonal 22.8
nut 22.4
natural 22.1
squash 21.9
season 21.8
orange 21.5
close 21.1
healthy 20.8
device 20.2
diet 20.2
nutrition 20.1
organic 19.3
brown 19.1
nectarine 18.9
chestnut 18.8
edible fruit 18.7
seed 18.3
delicious 17.3
ripe 17.2
sweet 15.8
agriculture 15.8
snack 15.4
market 15.1
plant 14.8
vitamin 14.6
apple 14.5
basket 14.2
health 13.9
fruits 13.2
decoration 13
pile 12.2
closeup 12.1
vegetables 11.8
color 11.7
nuts 11.7
colorful 11.5
ingredient 11.4
yellow 11.3
juicy 10.9
eat 10.9
pumpkins 10.8
farm 10.7
dessert 10.6
vitamins 10.5
festive 10.2
dumbbell 10.1
tasty 10
chestnuts 9.9
weight 9.9
vegetarian 9.8
thanksgiving 9.8
peach 9.7
black 9.6
grapes 9.6
edible nut 9.5
round 9.5
gourmet 9.3
holiday 9.3
dark 9.2
summer 9
gourd 8.9
leaf 8.6
juice 8.5
heap 8.5
freshness 8.3
group 8.1
raw 8
lifestyle 7.9
leaves 7.9
outside 7.7
plate 7.6
nutritious 7.6
traditional 7.5
object 7.3
hazelnut 7.2
sports equipment 7
life 7
wooden 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 93.7
fruit 85.8
black and white 78.8
food 67.2
human face 62.5

Color Analysis

Face analysis

AWS Rekognition

Age 11-19
Gender Female, 100%
Sad 70.9%
Disgusted 17.2%
Confused 3.6%
Angry 3.5%
Calm 3.2%
Fear 1%
Happy 0.3%
Surprised 0.3%

Microsoft Cognitive Services

Age 13
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Categories

Imagga

food drinks 91.6%
people portraits 4.5%
paintings art 2.5%

Text analysis

Amazon

LB
OLE LB $
$
OLE

Google

40 LB
40
LB