Human Generated Data

Title

Untitled (family on porch, possibly Pulaski County, Arkansas)

Date

1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3046

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Chair 99.9
Furniture 99.9
Person 98.7
Human 98.7
Person 98.5
Sitting 88.9
Clothing 83.6
Apparel 83.6
Shorts 75.4
Home Decor 71.6
Person 67.6
People 64.2
Face 58.3
Linen 57.5
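
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. The sketch below shows how such tags could be regenerated with boto3; the local file name and the confidence threshold are illustrative assumptions, not values from this record.

```python
# Minimal sketch (assumptions noted above): label detection with AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition")  # region/credentials come from the environment

with open("shahn_family_on_porch.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,       # assumed cap on returned labels
        MinConfidence=50,   # assumed confidence floor, in percent
    )

# Print "Label confidence" pairs in the same style as the record above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```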

Clarifai
created on 2023-10-15

people 100
adult 99.3
two 98.8
child 98.1
group together 97.5
portrait 97.4
woman 97.2
group 96.9
three 96.9
offspring 96.3
wear 95.7
man 95.6
recreation 94.3
furniture 91.4
family 90.7
four 90
chair 89.8
monochrome 89.6
boy 88.3
seat 88.1
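
A sketch of how tags like these might be requested from Clarifai's v2 REST predict endpoint follows. The endpoint path, model ID, user/app IDs, and auth header are assumptions based on Clarifai's public documentation and may differ by account; the image URL is a placeholder.

```python
# Minimal sketch (endpoint, model ID, and auth scheme are assumptions; image URL is a placeholder).
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key <personal-access-token>"},
    json={
        "user_app_id": {"user_id": "clarifai", "app_id": "main"},
        "inputs": [{"data": {"image": {"url": "https://example.org/family-on-porch.jpg"}}}],
    },
)
resp.raise_for_status()

# Clarifai reports concept values in 0-1; the record above shows them scaled to 0-100.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```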

Imagga
created on 2021-12-15

person 28.3
portrait 26.5
fashion 25.6
people 25.1
sexy 24.9
model 23.3
attractive 23.1
pretty 21.7
adult 21.5
hair 21.4
body 20
lady 19.5
skin 17.8
black 16.9
sensuality 16.4
dress 16.3
posing 16
style 15.6
human 15
clothing 14.7
studio 14.4
elegance 14.3
elegant 13.7
youth 13.6
face 13.5
lifestyle 13
cute 12.9
sitting 12.9
erotic 12.5
garment 11.6
legs 11.3
happy 11.3
child 11
blond 10.4
women 10.3
makeup 10.1
teenager 10
sensual 10
expression 9.4
hand 9.1
summer 9
one 9
naked 8.7
chair 8.6
smile 8.6
old 8.4
skirt 8.3
street 8.3
pose 8.2
gorgeous 8.2
water 8
lovely 8
product 7.8
nude 7.8
mother 7.7
shoes 7.7
fashionable 7.6
dark 7.5
clothes 7.5
vintage 7.4
newspaper 7.3
make 7.3
parent 7.2
covering 7.1
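
Imagga exposes its tagger as a REST endpoint with HTTP basic authentication. The sketch below assumes the v2 /tags endpoint and a publicly reachable image URL; the key, secret, and URL are placeholders.

```python
# Minimal sketch (API key/secret and image URL are placeholders, not values from this record).
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/family-on-porch.jpg"},  # hypothetical URL
    auth=("<api_key>", "<api_secret>"),  # HTTP basic auth with Imagga credentials
)
resp.raise_for_status()

# Imagga returns tag confidences already on a 0-100 scale.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```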

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.7
clothing 95.4
outdoor 94.8
person 92.6
footwear 89.5
smile 58.7
human face 51
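
Tags of this form can be requested from the Azure Computer Vision "analyze" REST endpoint. In the sketch below the resource endpoint, subscription key, API version, and image URL are all illustrative assumptions.

```python
# Minimal sketch (endpoint host, key, API version, and image URL are assumptions).
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # hypothetical Azure resource
KEY = "<subscription-key>"                                        # hypothetical key

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/family-on-porch.jpg"},  # hypothetical image URL
)
resp.raise_for_status()

# Azure reports confidence in 0-1; the record above shows it scaled to 0-100.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```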

Color Analysis

Face analysis

AWS Rekognition

Age 30-46
Gender Male, 79.6%
Sad 43.5%
Calm 36.7%
Fear 7.7%
Angry 4.6%
Confused 3.6%
Happy 2.1%
Surprised 1.4%
Disgusted 0.4%

AWS Rekognition

Age 21-33
Gender Female, 77.1%
Fear 73%
Sad 12.6%
Calm 6.1%
Surprised 3.9%
Happy 2.1%
Angry 1%
Confused 0.8%
Disgusted 0.5%
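
The per-face age range, gender, and emotion estimates above match the shape of AWS Rekognition's DetectFaces output. A minimal boto3 sketch follows; the local file name is an illustrative assumption.

```python
# Minimal sketch: per-face attributes with AWS Rekognition DetectFaces (boto3).
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_family_on_porch.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back unsorted; order them by confidence as in the record above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```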

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
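
The likelihood fields above correspond to Google Cloud Vision face detection. The sketch below uses the google-cloud-vision Python client (v2+ field names); the local file name is an illustrative assumption.

```python
# Minimal sketch: face likelihoods with the google-cloud-vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # credentials come from the environment

with open("shahn_family_on_porch.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a vision.Likelihood enum, e.g. VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```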

Feature analysis

Amazon

Person 98.7%

Categories

Imagga

paintings art 99.9%