Human Generated Data

Title

Untitled (woman with boy and girl on couch)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21732

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Clothing 100
Apparel 100
Person 98.6
Human 98.6
Coat 98.3
Person 98.1
Person 95.4
Furniture 92.6
Shoe 91.1
Footwear 91.1
Chair 79.5
Shoe 72.9
Person 68.7
Jacket 64.9
Overcoat 63.9
Sunglasses 63.3
Accessories 63.3
Accessory 63.3
Suit 57.5
Raincoat 57.4
Shoe 56

Clarifai
created on 2023-10-22

people 99.8
adult 97.8
monochrome 97.6
man 95.7
portrait 95.6
wear 95.2
woman 94.6
two 93.9
group 93.9
chair 93.4
group together 93.1
wedding 92.6
sit 91.8
three 90.7
furniture 90.6
facial expression 87.8
retro 85.5
seat 84.1
child 83.4
four 81.4

Imagga
created on 2022-03-11

people 30.7
man 29.5
person 26.1
adult 23.9
male 20.6
men 16.3
happy 16.3
fashion 15.1
couple 14.8
sitting 14.6
sexy 14.4
two 14.4
happiness 14.1
business 14
model 13.2
women 12.6
professional 12.6
pretty 12.6
attractive 12.6
black 11.4
style 11.1
hair 11.1
love 11
portrait 11
dark 10.8
together 10.5
work 10.3
smiling 10.1
lifestyle 10.1
suit 10
human 9.7
fun 9.7
body 9.6
room 9.4
smile 9.3
lady 8.9
group 8.9
clothing 8.8
indoors 8.8
passion 8.5
office 8
handsome 8
looking 8
sport 8
home 8
worker 8
businessman 7.9
face 7.8
corporate 7.7
casual 7.6
executive 7.6
one 7.5
vintage 7.4
training 7.4
cheerful 7.3
indoor 7.3
sensual 7.3
table 7.2
romantic 7.1
family 7.1
posing 7.1
job 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 98.8
person 95
clothing 94.4
outdoor 85.2
dance 84.8
footwear 84.7
smile 72.7
posing 58.3
man 55

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 18-24
Gender Male, 98.1%
Calm 79.9%
Happy 14.3%
Surprised 2.4%
Disgusted 1.1%
Confused 0.9%
Angry 0.6%
Sad 0.5%
Fear 0.4%

AWS Rekognition

Age 23-31
Gender Female, 83%
Calm 76.1%
Surprised 19.2%
Happy 3.3%
Confused 0.5%
Disgusted 0.5%
Sad 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 43-51
Gender Male, 89.3%
Happy 70.9%
Calm 25.9%
Surprised 2%
Confused 0.4%
Disgusted 0.3%
Sad 0.3%
Angry 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Sunglasses
Person 98.6%
Person 98.1%
Person 95.4%
Person 68.7%
Shoe 91.1%
Shoe 72.9%
Shoe 56%
Sunglasses 63.3%

Categories

Imagga

people portraits 90.9%
paintings art 7.5%

Text analysis

Amazon

KODAK-
TTA