Human Generated Data

Title

Untitled (woman holding flower centerpieces)

Date

c. 1950

People

Artist: Robert Burian, American, active 1940s–1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19423

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Apparel 97.8
Clothing 97.8
Sleeve 97.6
Human 96.3
Person 96.3
Long Sleeve 83.9
Evening Dress 81.1
Fashion 81.1
Robe 81.1
Gown 81.1
Text 73.4
Art 73.3
Drawing 73.3
Plant 72.6
Lace 70.2
Furniture 69.2
Tabletop 69.2
Face 67.3
Flower 63.3
Blossom 63.3
Female 60.2
Sketch 56.6
Table 56.1
Dress 55.2
Tree 55.1
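
The scores above are confidence percentages. Label/score pairs like these can be produced with Amazon Rekognition's DetectLabels API; the boto3 sketch below is illustrative only, and the file name, region, and confidence floor are placeholder assumptions rather than details recorded with this object.

import boto3

# Placeholder region; Rekognition is region-scoped.
client = boto3.client("rekognition", region_name="us-east-1")

# "photo.jpg" stands in for the digitized photograph.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed floor; the lowest score shown above is 55.1
    )

# Print each label with its confidence, matching the list format above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')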

Imagga
created on 2022-03-05

person 30.3
people 26.2
portrait 25.2
adult 23.1
model 22.5
pretty 21.7
attractive 20.3
human 20.2
hair 19.8
child 19.6
smile 18.5
sexy 18.5
fashion 18.1
happiness 18
happy 17.5
body 16.8
smiling 16.6
cute 15.8
fun 15.7
clothing 15.2
joy 15
face 14.2
one 14.2
standing 13.9
lifestyle 13
cheerful 13
man 12.8
leisure 11.6
lady 11.4
healthy 11.3
brunette 11.3
love 11
dress 10.8
garment 10.8
activity 10.7
little 10.6
outdoors 10.4
looking 10.4
expression 10.2
relaxation 10
childhood 9.8
women 9.5
sport 9.5
slim 9.2
makeup 9.1
world 9.1
male 9.1
active 9.1
blond 9.1
health 9
summer 9
style 8.9
posing 8.9
device 8.8
shower 8.7
water 8.7
elegance 8.4
pink 8.4
action 8.3
pose 8.1
youth 7.7
sky 7.6
black 7.6
studio 7.6
life 7.6
sibling 7.5
girls 7.3
sensual 7.3
sensuality 7.3
fitness 7.2
wet 7.1
family 7.1
kid 7.1
covering 7.1
look 7
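
Imagga's scores are likewise confidence values. A hedged sketch of fetching such tags through Imagga's /v2/tags REST endpoint follows; the credentials and image URL are placeholders, and the response layout is an assumption based on Imagga's public v2 API.

import requests

# Placeholder credentials; Imagga uses HTTP Basic auth with an API key/secret pair.
auth = ("YOUR_API_KEY", "YOUR_API_SECRET")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=auth,
    timeout=30,
)
resp.raise_for_status()

# Assumed response shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')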

Google
created on 2022-03-05 (no tags returned)

Microsoft
created on 2022-03-05

text 98.3
wedding dress 95.2
dress 93.1
bride 87.3
clothing 82.8
woman 80.2
person 76.4
white 63.5
old 58
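
The Microsoft tags above correspond to the Azure Computer Vision tag operation. A minimal sketch using the azure-cognitiveservices-vision-computervision client; the endpoint, key, and image URL are placeholders, and the SDK's 0–1 confidence is scaled to match the percentages shown.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

# Tag a remote image; the URL is a stand-in for the digitized photograph.
result = client.tag_image("https://example.org/photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")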

Face analysis

AWS Rekognition

Age 51-59
Gender Female, 97.1%
Happy 75.3%
Calm 16.8%
Sad 2.8%
Fear 1.3%
Surprised 1.2%
Confused 1.1%
Disgusted 0.8%
Angry 0.6%
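
The age range, gender estimate, and emotion distribution above are the standard fields of Rekognition's DetectFaces response when all facial attributes are requested. A minimal boto3 sketch; the file name and region are placeholders.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder file
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort to match the descending list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')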

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
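
The ratings above are Google Cloud Vision's Likelihood enum (VERY_UNLIKELY through VERY_LIKELY) rendered as prose. A minimal sketch with the google-cloud-vision client; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application-default credentials

with open("photo.jpg", "rb") as f:  # placeholder file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum, e.g. VERY_UNLIKELY -> "Very unlikely".
    print("Surprise:", face.surprise_likelihood.name)
    print("Anger:", face.anger_likelihood.name)
    print("Sorrow:", face.sorrow_likelihood.name)
    print("Joy:", face.joy_likelihood.name)
    print("Headwear:", face.headwear_likelihood.name)
    print("Blurred:", face.blurred_likelihood.name)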

Feature analysis

Amazon

Person 96.3%

Captions

Microsoft

a vintage photo of a man 87.5%
an old photo of a man 85.4%
old photo of a man 82.1%
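
Captions like these come from the Azure Computer Vision describe operation, which returns ranked caption candidates with confidences. A minimal sketch; the endpoint, key, and image URL are placeholders, and max_candidates=3 is assumed from the three captions shown.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
)

result = client.describe_image("https://example.org/photo.jpg", max_candidates=3)
for caption in result.captions:
    # Confidence is 0-1 in the SDK; scale to match the percentages above.
    print(f"{caption.text} {caption.confidence * 100:.1f}%")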

Text analysis

Amazon

4
5
с
MAQOX
MJI7
MJI7 YT37 A°2
YT37
A°2
201
XOO
201 TELA kirn
TELA
kirn
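
The fragments above are raw OCR output; garbled strings are expected when photo-paper backprinting is mirrored or faint. A minimal sketch of the Rekognition DetectText call that yields output of this kind; the file name and region are placeholders.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder file
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections; print lines only
# to avoid listing each word twice.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])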

Google

KODYK 2.YEEJA Eirn
Eirn
2.YEEJA
KODYK
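
A matching sketch for Google Cloud Vision's text detection, which produced the strings above; the file name is a placeholder. The first annotation in the response is the full concatenated text block, followed by the individual elements.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application-default credentials

with open("photo.jpg", "rb") as f:  # placeholder file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)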