Human Generated Data

Title

Untitled (three women seated posing in room surrounded by flowers)

Date

1937

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9004

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99
Human 99
Person 98.8
Person 94.6
Collage 88.6
Advertisement 88.6
Poster 88.6
Clothing 75.2
Apparel 75.2
Face 74.1
Art 69.4
Female 66.1
People 62.2
Photography 60
Photo 60
Flyer 58.5
Brochure 58.5
Paper 58.5
Performer 58.4
Sleeve 57.9
Text 57.6
Leisure Activities 56.3

Clarifai
created on 2023-10-26

people 99.9
group 99.8
adult 97.2
many 96.8
woman 96.7
monochrome 95
child 94.9
interaction 94.5
illustration 93.8
music 92.9
man 92.9
several 91.9
leader 90
musician 87.8
wear 86.7
offspring 85.9
art 85.8
education 85.2
retro 82.8
dancing 82.8

Imagga
created on 2022-01-23

film 50.4
negative 39.9
photographic paper 36
groom 26.4
people 25.6
photographic equipment 24
x-ray film 22.5
person 22.2
man 21.5
adult 16.8
bride 16.4
male 15.6
newspaper 15.6
portrait 15.5
black 15
couple 13.9
love 12.6
old 12.5
happy 12.5
product 11.9
wedding 11.9
business 11.5
human 11.2
hair 11.1
office 10.4
women 10.3
aquarium 10.2
dress 9.9
blackboard 9.8
creation 9.7
two 9.3
smile 9.3
face 9.2
silhouette 9.1
vintage 9.1
pretty 9.1
one 9
businessman 8.8
looking 8.8
symbol 8.7
bridal 8.7
light 8.7
lifestyle 8.7
happiness 8.6
art 8.6
design 8.4
lady 8.1
sexy 8
sitting 7.7
attractive 7.7
frame 7.6
wife 7.6
head 7.6
senior 7.5
retro 7.4
back 7.3
alone 7.3
smiling 7.2
home 7.2
day 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.7
wedding dress 94.7
bride 90.9
person 89.9
dress 88.5
clothing 87.7
posing 87.1
woman 86.5
human face 81
smile 57.6
old 55.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 51-59
Gender Male, 87.7%
Calm 55.2%
Sad 38.2%
Confused 2%
Happy 1.6%
Disgusted 1.1%
Angry 0.8%
Fear 0.6%
Surprised 0.4%

AWS Rekognition

Age 38-46
Gender Female, 99.4%
Happy 98.9%
Calm 0.5%
Angry 0.2%
Sad 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 47-53
Gender Male, 66.6%
Happy 97.3%
Calm 0.7%
Angry 0.5%
Surprised 0.5%
Confused 0.4%
Sad 0.3%
Fear 0.2%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Categories

Imagga

paintings art 90.5%
text visuals 8.1%

Text analysis

Amazon

322
LISSE
LICH
علام
AM

Google

322 3 S S
322
3
S