Human Generated Data

Title

chris and kuka

Date

2001, printed 2019

People

Artist: Dylan Vitone, American, born 1978

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Fund for the Acquisition of Photographs, 2019.119

Copyright

© the Artist

Machine Generated Data

Tags

Amazon
created on 2019-04-17

Clothing 99.8
Shorts 99.8
Apparel 99.8
Person 99.7
Human 99.7
Person 98.5
Shoe 97.7
Footwear 97.7
Shoe 77.1
Tire 73.2
Vehicle 69.3
Transportation 69.3
Undershirt 69.3
Hat 63.0
Skin 59.3
Sleeve 56.8
Pants 55.1

Clarifai
created on 2019-04-17

people 100.0
adult 99
two 98.7
woman 98.1
group together 97.9
three 96.9
administration 96.1
man 95.7
group 95.7
four 94.4
several 93.2
wear 92.7
facial expression 90.5
offspring 90.3
actress 88.7
five 88.4
sibling 88.2
vehicle 86.8
actor 85.3
child 85.0

Imagga
created on 2019-04-17

person 27.4
people 26.8
adult 25.9
couple 23.5
happy 22.6
love 22.1
portrait 22
man 20.2
male 20.1
dress 19.9
attractive 19.6
happiness 18.8
car 17.3
bride 17.3
wedding 16.6
fashion 15.8
together 15.8
smile 15.7
mother 15.3
child 15.2
world 14.8
pretty 14.7
kin 14.6
outdoor 14.5
two 14.4
groom 14.4
cute 14.3
smiling 13.7
family 13.3
outside 12.8
women 12.6
lifestyle 12.3
life 11.6
clothing 11.6
park 11.5
married 11.5
girls 10.9
holding 10.7
fun 10.5
old 10.4
marriage 10.4
summer 10.3
day 10.2
model 10.1
elegance 10.1
vehicle 10.1
face 9.9
sexy 9.6
automobile 9.6
religious 9.4
hand 9.1
human 9
umbrella 9
lady 8.9
posing 8.9
home 8.8
hair 8.7
men 8.6
joy 8.4
skin 7.7
expression 7.7
outdoors 7.6
leisure 7.5
street 7.4
20s 7.3
religion 7.2
romantic 7.1
little 7.1

Google
created on 2019-04-17

Photograph 96.3
People 93.4
Snapshot 89.5
Standing 85.3
Vehicle 74.5
Car 72.3
Monochrome 70.2
Black-and-white 68.3
Street 64.9
Photography 62.4
Family 53.1
Family car 51.5
Style 51.0

Microsoft
created on 2019-04-17

person 99.4
outdoor 98.4
standing 86.9
posing 58.6
black and white 19.4
street 14.1
monochrome 4.3
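Each service above reports its tags as label–confidence pairs on a 0–100 scale. As an illustrative aside (not part of the catalog record), such pairs can be filtered by a confidence threshold before use; the function name and data structure below are assumptions, and only a few of the Amazon tags listed above are reproduced.

```python
def filter_tags(tags, min_confidence=90.0):
    """Keep only (label, confidence) pairs at or above the threshold."""
    return [(label, score) for label, score in tags if score >= min_confidence]

# A few of the Amazon tags from the record above, as (label, confidence).
amazon_tags = [
    ("Clothing", 99.8),
    ("Shorts", 99.8),
    ("Person", 99.7),
    ("Shoe", 77.1),
    ("Hat", 63.0),
]

# Drop the low-confidence "Shoe" and "Hat" detections.
high_confidence = filter_tags(amazon_tags, min_confidence=90.0)
print(high_confidence)  # [('Clothing', 99.8), ('Shorts', 99.8), ('Person', 99.7)]
```

The threshold is a judgment call: the record shows the services disagree (Clarifai reports "people 100" while Imagga's top tag is "person 27.4"), so a single cutoff applied across services would discard most Imagga tags.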

Face analysis

AWS Rekognition

Age 20-38
Gender Male, 72.2%
Surprised 1.3%
Calm 93.1%
Disgusted 0.3%
Happy 0.2%
Sad 2.1%
Confused 2.0%
Angry 1.1%

AWS Rekognition

Age 11-18
Gender Male, 99.7%
Happy 4.3%
Surprised 10.8%
Calm 44.4%
Disgusted 2.6%
Angry 3.9%
Sad 4.0%
Confused 30.1%

Microsoft Cognitive Services

Age 24
Gender Male

Microsoft Cognitive Services

Age 20
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 97.7%

Text analysis

Google

PO1O. 1967
PO1O.
1967