Human Generated Data

Title

Portrait #21

Date

September 27, 2001

People

Artist: Kevin E. Bubriski, American, born 1954

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P2005.17

Copyright

© 2001 Kevin Bubriski

Machine Generated Data

Tags

Each tag below is paired with the model's confidence score on a 0-100 scale.

Amazon
created on 2022-01-09

Human 98.9
Person 98.9
Person 98.8
Person 98.5
Person 97.2
Indoors 93.7
Interior Design 93.7
Leisure Activities 80
Crowd 79.9
Dance Pose 79
Person 73.4
Performer 71.2
Finger 70.7
Face 67
Audience 58.3
Senior Citizen 57.6
Dance 57.3
Blonde 57.3
Girl 57.3
Woman 57.3
Female 57.3
Teen 57.3
Kid 57.3
Child 57.3
Tango 56
Overcoat 55.6
Clothing 55.6
Coat 55.6
Apparel 55.6
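
A minimal sketch, in Python, of how label/confidence pairs like the ones above can be retrieved from the Amazon Rekognition DetectLabels API via boto3. The file name and MinConfidence threshold are assumptions; the exact request behind this record is not documented.

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("portrait_21.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed cutoff; the lowest score above is 55.6
)

# Each label pairs a name with a confidence score on a 0-100 scale,
# matching the tag lists in this record.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))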

Clarifai
created on 2023-10-25

monochrome 99.9
people 99.8
portrait 99.2
two 98.7
adult 98.6
woman 98.1
man 97.9
couple 96
black and white 95.7
girl 95.3
wear 94.9
group 93.6
three 93.4
music 92.2
street 90.8
retro 90.1
facial expression 90
son 89.1
child 88.5
comedy 88.1

Imagga
created on 2022-01-09

world 28.3
people 25.7
kin 23.9
person 22.7
man 22.2
statue 20.7
old 18.8
adult 18.2
portrait 17.5
black 17
couple 15.7
male 15
face 14.9
love 14.2
human 13.5
outdoor 12.2
outdoors 11.9
one 11.9
sculpture 11.9
vintage 11.6
brass 11.2
hair 11.1
lifestyle 10.8
planner 10.7
happy 10.7
antique 10.4
happiness 10.2
family 9.8
sepia 9.7
home 9.6
women 9.5
art 9.3
city 9.1
wind instrument 9
bride 8.6
day 8.6
eyes 8.6
smile 8.6
mother 8.5
musical instrument 8.5
two 8.5
senior 8.4
park 8.3
spectator 8.3
blond 8.2
aged 8.1
lady 8.1
romantic 8
looking 8
smiling 8
men 7.7
elderly 7.7
husband 7.6
hand 7.6
dark 7.5
relationship 7.5
mature 7.4
street 7.4
light 7.4
detail 7.2
religion 7.2
romance 7.1
to 7.1
together 7

Microsoft
created on 2022-01-09

text 99.2
clothing 95.9
human face 95
window 94.4
person 91.1
woman 80.6
black and white 73
street 51.4

Color Analysis

(no color data was captured for this record)

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 96.5%
Calm 65.4%
Sad 32.6%
Confused 0.5%
Disgusted 0.4%
Surprised 0.3%
Angry 0.3%
Fear 0.3%
Happy 0.1%

AWS Rekognition

Age 13-21
Gender Female, 96.7%
Angry 49.9%
Calm 20.2%
Disgusted 12.8%
Fear 8.6%
Confused 3.4%
Sad 3%
Surprised 1.5%
Happy 0.7%

AWS Rekognition

Age 25-35
Gender Male, 92.9%
Calm 42.8%
Confused 39.9%
Angry 6.8%
Fear 2.8%
Sad 2.7%
Surprised 2.1%
Disgusted 2%
Happy 0.9%
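
A minimal sketch of how the AWS Rekognition face results above (age range, gender, per-emotion confidences) can be obtained from the DetectFaces API with all facial attributes requested. Again, the file name is a hypothetical stand-in.

import boto3

client = boto3.client("rekognition")

with open("portrait_21.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

# One FaceDetail per detected face; this record lists three.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")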

Microsoft Cognitive Services

Age 38
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
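
A minimal sketch of how the Google Vision rows above map to the Cloud Vision API's face-annotation likelihood enums (VERY_UNLIKELY through VERY_LIKELY), using the google-cloud-vision client. The file name is a hypothetical stand-in.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("portrait_21.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

def pretty(likelihood: int) -> str:
    # e.g. VERY_UNLIKELY -> "Very unlikely", as shown in this record
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

# One annotation per detected face; this record lists three.
for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))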

Feature analysis

Amazon

Person 98.9%

Categories

Imagga

paintings art 99.8%