Human Generated Data

Title

Untitled (men arranging mannequins in shop window)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14883

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Poster 99.8
Advertisement 99.8
Human 99.7
Person 99.7
Person 99.5
Clothing 99.3
Apparel 99.3
Person 98.7
Person 97.8
Shorts 93.2
Person 91.1
Female 81.2
Collage 73.1
Overcoat 72.7
Suit 72.7
Coat 72.7
Person 71.7
Face 70.1
Woman 68.6
Dress 66.7
Door 64.8
Person 61.3
Dessert 58.5
Creme 58.5
Cream 58.5
Icing 58.5
Food 58.5
Cake 58.5
Girl 55.9
Skirt 55.1

Imagga
created on 2022-01-29

shop 30.1
newspaper 24.6
barbershop 23.8
people 21.7
man 21.5
adult 20.4
product 19.8
kin 19.6
mercantile establishment 18.9
window 18
home 17.5
portrait 17.5
interior 16.8
couple 16.5
family 16
person 15.6
male 15
happiness 14.9
creation 14.8
happy 14.4
mother 14.1
room 14
life 14
dress 13.5
house 13.4
indoor 12.8
business 12.8
place of business 12.5
women 11.9
casual 11.9
love 11.8
smile 11.4
urban 11.4
fashion 11.3
door 11
two 11
clothing 10.9
bride 10.5
parent 10.2
wedding 10.1
city 10
new 9.7
looking 9.6
men 9.4
sliding door 9.4
wall 9.4
vintage 9.1
pretty 9.1
human 9
businessman 8.8
indoors 8.8
mall 8.8
light 8.7
smiling 8.7
luxury 8.6
glass 8.6
face 8.5
buy 8.4
design 8.4
black 8.4
modern 8.4
future 8.4
inside 8.3
shopping 8.3
child 8.2
groom 8.1
decoration 8
lifestyle 7.9
standing 7.8
attractive 7.7
old 7.7
building 7.5
style 7.4
sale 7.4
alone 7.3
romantic 7.1
daughter 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 98.7
clothing 98.5
person 97.2
window 94.8
footwear 92.4
standing 82.6
man 79.4
gallery 72.5
store 41.4

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 94.7%
Calm 58.4%
Happy 26.5%
Fear 6.7%
Sad 3.7%
Surprised 2%
Angry 1.4%
Disgusted 0.9%
Confused 0.5%

AWS Rekognition

Age 27-37
Gender Female, 88.7%
Calm 67.5%
Happy 30.3%
Fear 1.1%
Disgusted 0.4%
Sad 0.3%
Surprised 0.2%
Angry 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Poster 99.8%
Person 99.7%

Captions

Microsoft

a group of people standing in front of a window 93.1%
a group of people standing in front of a store window 85.6%
a man standing in front of a window 85.5%

Text analysis

Amazon

HAMJRA39
RUA
100 RUA HAMJRA39
100
SUBATRA
TROMAIC
and
Brahlin

Google

UBATES
29A3VOE
MAIC
UBATES 29A3VOE MAIC