Human Generated Data

Title

Portrait #22

Date

September 27, 2001

People

Artist: Kevin E. Bubriski, American (born 1954)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P2005.18

Copyright

© 2001 Kevin Bubriski

Machine Generated Data

Tags

Each tag below is followed by the service's confidence score, in percent.

Amazon
created on 2022-01-09

Person 99.1
Human 99.1
Person 99
Interior Design 98.2
Indoors 98.2
Person 94.7
Person 93.5
Dance Pose 93.4
Leisure Activities 93.4
Person 90.5
Face 89.3
Performer 83.8
Dance 79.8
Crowd 75.4
Hat 74.5
Clothing 74.5
Apparel 74.5
Finger 67.3
Tango 66
People 62.9
Blonde 61.6
Female 61.6
Teen 61.6
Kid 61.6
Girl 61.6
Woman 61.6
Child 61.6
Person 60.8
Hug 60
Portrait 59.4
Photography 59.4
Photo 59.4
Hair 58.2
Audience 57.1
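
The Amazon tags above are the kind of labels the AWS Rekognition DetectLabels API returns, each paired with a confidence score in percent; the repeated "Person" entries most likely correspond to separate detected instances of the same label. A minimal sketch of such a call with boto3, assuming the image sits in S3 (bucket and object names are hypothetical):

    # Minimal sketch: label detection with AWS Rekognition via boto3.
    # Bucket and object names are hypothetical placeholders.
    import boto3

    client = boto3.client("rekognition")
    response = client.detect_labels(
        Image={"S3Object": {"Bucket": "museum-images", "Name": "portrait-22.jpg"}},
        MaxLabels=50,
        MinConfidence=50,
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")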

Clarifai
created on 2023-10-25

monochrome 99.9
people 99.9
portrait 99.5
adult 98.7
man 98.5
two 98.1
street 98
black and white 97.7
woman 97.5
couple 96.9
group 96.6
music 96.3
three 94.3
comedy 93.3
musician 92.4
girl 91.1
son 91.1
facial expression 91.1
child 90.7
singer 89.7
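
Clarifai scores concepts on a 0-1 scale (rendered above as percentages). A hedged sketch of a predict call over Clarifai's v2 REST API; the API key, model ID, and image URL are placeholder assumptions:

    # Hedged sketch: Clarifai v2 predict over REST; the key, model ID,
    # and image URL are placeholder assumptions.
    import requests

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/portrait-22.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")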

Imagga
created on 2022-01-09

man 30.9
person 28
male 27.7
people 21.8
portrait 20.1
black 20
adult 19.5
human 16.5
old 14.6
one 14.2
world 13.6
senior 13.1
hair 12.7
business 12.1
statue 11.8
product 11.8
creation 11.6
face 10.7
wind instrument 10.5
looking 10.4
men 10.3
love 10.3
brass 10.1
businessman 9.7
expression 9.4
model 9.3
sculpture 9.3
head 9.2
dark 9.2
hand 9.1
art 9.1
office 9
body 8.8
happy 8.8
professional 8.6
sitting 8.6
musical instrument 8.4
mature 8.4
leisure 8.3
silhouette 8.3
alone 8.2
show 8.2
grandfather 8.2
religion 8.1
movie 8
light 8
sax 7.9
antique 7.8
eyes 7.7
attractive 7.7
vintage 7.4
lifestyle 7.2
suit 7.2
handsome 7.1
smile 7.1
family 7.1
work 7.1
scholar 7
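
Imagga serves tags from a REST endpoint authenticated with an API key/secret pair over HTTP Basic auth. A hedged sketch; credentials and the image URL are placeholders:

    # Hedged sketch: Imagga /v2/tags request; credentials and image URL
    # are placeholders.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/portrait-22.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")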

Google
created on 2022-01-09
(no tags recorded)

Microsoft
created on 2022-01-09

text 98.8
human face 96.5
person 93.7
clothing 91.8
smile 84.6
man 77
black and white 67.8
woman 65.6
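
The Microsoft tags match the shape of Azure Computer Vision's analyze endpoint, which scores tags on a 0-1 scale. A hedged sketch against the v3.2 REST API; the resource endpoint, subscription key, and image URL are assumptions:

    # Hedged sketch: Azure Computer Vision v3.2 tag request; endpoint,
    # key, and image URL are placeholder assumptions.
    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.com/portrait-22.jpg"},
    )
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")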

Color Analysis

(no color data recorded)

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 99.2%
Sad 47.6%
Calm 42.4%
Fear 7.2%
Disgusted 1.1%
Confused 0.8%
Angry 0.5%
Surprised 0.3%
Happy 0.2%

AWS Rekognition

Age 42-50
Gender Female, 96.7%
Calm 72.3%
Angry 15.6%
Confused 3.8%
Sad 3.2%
Surprised 1.8%
Fear 1.3%
Happy 1%
Disgusted 0.9%

AWS Rekognition

Age 36-44
Gender Female, 99.9%
Happy 59.1%
Calm 28%
Surprised 3.2%
Sad 3.1%
Fear 2.5%
Disgusted 1.7%
Confused 1.6%
Angry 0.8%

AWS Rekognition

Age 28-38
Gender Male, 98.8%
Calm 84.4%
Fear 7.8%
Sad 2.4%
Confused 2.4%
Angry 1.1%
Happy 0.7%
Disgusted 0.6%
Surprised 0.5%
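
Each AWS Rekognition block above (an age range, a gender with confidence, and an emotion distribution ranked by confidence) matches the FaceDetails structure DetectFaces returns when all attributes are requested. A minimal sketch with boto3; bucket and object names are hypothetical:

    # Minimal sketch: face analysis with AWS Rekognition via boto3.
    # Bucket and object names are hypothetical placeholders.
    import boto3

    client = boto3.client("rekognition")
    response = client.detect_faces(
        Image={"S3Object": {"Bucket": "museum-images", "Name": "portrait-22.jpg"}},
        Attributes=["ALL"],
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")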

Microsoft Cognitive Services

Age 55
Gender Male

Microsoft Cognitive Services

Age 39
Gender Female
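
The single-point age and gender estimates above match the legacy Azure Face API detect call with the age and gender attributes requested; Microsoft has since retired these attributes, so this is a historical sketch (endpoint, key, and image URL are placeholders):

    # Hedged sketch: legacy Azure Face API detect with age/gender
    # attributes (since retired by Microsoft); values are placeholders.
    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.com/portrait-22.jpg"},
    )
    for face in resp.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}, Gender {attrs['gender'].capitalize()}")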

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Likely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
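
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, one block per detected face. A sketch with the google-cloud-vision client library; the image URL is a placeholder:

    # Sketch: face detection with the google-cloud-vision client.
    # The image URL is a placeholder.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = "https://example.com/portrait-22.jpg"
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)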

Feature analysis

Amazon

Person 99.1%
Hat 74.5%

Categories

Imagga

paintings art 99.7%
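
"paintings art" is one of the categories in Imagga's personal_photos categorizer, which is queried much like the tagging endpoint. A hedged sketch; the categorizer ID follows Imagga's public docs, and credentials and the image URL are placeholders:

    # Hedged sketch: Imagga categorization against the personal_photos
    # categorizer; credentials and image URL are placeholders.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",
        params={"image_url": "https://example.com/portrait-22.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    for cat in resp.json()["result"]["categories"]:
        print(f"{cat['name']['en']} {cat['confidence']:.1f}")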