Human Generated Data

Title

Untitled (young girl and young boy posed sitting on ottoman with dog beneath it)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9762

Machine Generated Data

Tags

Amazon
created on 2022-01-24

Human 99
Person 99
Person 98.9
Floor 95.8
Clothing 94.5
Apparel 94.5
Flooring 92.7
Female 86
Chair 83.5
Furniture 83.5
Shorts 80.7
Face 75.7
Girl 73
Portrait 67.5
Photography 67.5
Photo 67.5
Kid 66.6
Child 66.6
Sitting 65.5
Dress 65.3
Woman 63.9
Indoors 63.9
Door 63.7
Art 62.6
Drawing 62.2
People 61.5
Curtain 56.9
Footwear 56
Shoe 56
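A hedged sketch of how a label list like the one above could be produced: AWS Rekognition's `DetectLabels` API returns each label with a confidence score, which can then be filtered by a threshold. The live call is commented out because it needs AWS credentials and an image file (the filename `photo.jpg` is a placeholder, not from this record); the helper works on any response with the same shape, and the sample mirrors a few of the tags recorded above.

```python
def labels_above(response, min_confidence=55.0):
    """Return (name, confidence) pairs from a DetectLabels-style response,
    keeping only labels at or above the given confidence threshold."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]

# Live call (requires AWS credentials; "photo.jpg" is a hypothetical filename):
# import boto3
# client = boto3.client("rekognition")
# with open("photo.jpg", "rb") as f:
#     response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

# Sample response mirroring a few of the tags recorded above:
sample = {
    "Labels": [
        {"Name": "Human", "Confidence": 99.0},
        {"Name": "Person", "Confidence": 99.0},
        {"Name": "Floor", "Confidence": 95.8},
        {"Name": "Curtain", "Confidence": 56.9},
    ]
}
print(labels_above(sample, 90))  # [('Human', 99.0), ('Person', 99.0), ('Floor', 95.8)]
```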

Imagga
created on 2022-01-24

brass 43.7
wind instrument 32.9
person 26.3
musical instrument 23.3
people 21.2
trombone 21.1
body 20.8
human 19.5
adult 18.9
mask 18.6
negative 16.4
man 15.4
sexy 15.3
film 15.1
fashion 14.3
portrait 14.2
male 14.2
posing 13.3
black 13.3
attractive 13.3
covering 12.9
clothing 12.5
face 12.1
helmet 11.9
pretty 11.9
figure 11.8
sport 11.8
model 11.7
holding 11.6
device 11.2
style 11.1
action 11.1
cornet 11.1
pose 10.9
fitness 10.8
lifestyle 10.8
active 10.1
art 10.1
dark 10
football helmet 10
exercise 10
hair 9.5
photographic paper 9.5
power 9.2
modern 9.1
sensual 9.1
lady 8.9
costume 8.9
happy 8.8
women 8.7
ball 8.6
expression 8.5
skin 8.5
studio 8.4
fun 8.2
stylish 8.1
bass 8
science 8
conceptual 7.9
play 7.8
3d 7.7
anatomy 7.7
war 7.7
world 7.6
healthy 7.6
head 7.6
lamp 7.5
music 7.4
make 7.3
headdress 7.2
dancer 7.1
to 7.1

Google
created on 2022-01-24

Microsoft
created on 2022-01-24

text 97.8
clothing 83.2
person 82.7
human face 51.6

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Female, 99%
Calm 64.4%
Surprised 19.3%
Fear 11.4%
Happy 3%
Sad 0.7%
Angry 0.5%
Disgusted 0.5%
Confused 0.3%

AWS Rekognition

Age 28-38
Gender Female, 99.8%
Calm 74.4%
Happy 23.6%
Surprised 1.3%
Sad 0.3%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Confused 0.1%
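The per-face age, gender, and emotion percentages above have the shape returned by AWS Rekognition's `DetectFaces` API with `Attributes=["ALL"]`. A minimal sketch, assuming a response of that shape: this helper picks the dominant emotion for one face; the sample dict mirrors the first face recorded above rather than a live API call.

```python
def dominant_emotion(face_detail):
    """Return (type, confidence) of the highest-confidence emotion
    in a DetectFaces-style FaceDetail dict."""
    top = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

# Sample FaceDetail mirroring the first face recorded above:
sample_face = {
    "AgeRange": {"Low": 20, "High": 28},
    "Gender": {"Value": "Female", "Confidence": 99.0},
    "Emotions": [
        {"Type": "CALM", "Confidence": 64.4},
        {"Type": "SURPRISED", "Confidence": 19.3},
        {"Type": "FEAR", "Confidence": 11.4},
    ],
}
print(dominant_emotion(sample_face))  # ('CALM', 64.4)
```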

Feature analysis

Amazon

Person 99%
Shoe 56%

Captions

Microsoft

a person standing in front of a window 58.7%
a person standing next to a window 46.4%
a person standing in front of a window 46.3%

Text analysis

Amazon

KODVK
COVEETA--EITW
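OCR strings like the two above could come from AWS Rekognition's `DetectText` API, which returns both LINE- and WORD-level detections. A sketch under that assumption: the helper keeps only LINE-level text from a response of that shape, and the sample mirrors the strings recorded above (left garbled exactly as detected).

```python
def detected_lines(response):
    """Return the text of LINE-type detections from a DetectText-style response,
    skipping the duplicate WORD-level entries."""
    return [
        d["DetectedText"]
        for d in response["TextDetections"]
        if d["Type"] == "LINE"
    ]

# Sample response mirroring the detected strings recorded above:
sample_text = {
    "TextDetections": [
        {"DetectedText": "KODVK", "Type": "LINE"},
        {"DetectedText": "KODVK", "Type": "WORD"},
        {"DetectedText": "COVEETA--EITW", "Type": "LINE"},
    ]
}
print(detected_lines(sample_text))  # ['KODVK', 'COVEETA--EITW']
```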