Human Generated Data

Title

Untitled (two women in dresses)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19251

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.9
Human 98.9
Person 98.1
Clothing 96.6
Apparel 96.6
Plant 94.1
Potted Plant 91.6
Vase 91.6
Jar 91.6
Pottery 91.6
Sunglasses 89.8
Accessories 89.8
Accessory 89.8
Tree 84.9
Shoe 82.5
Footwear 82.5
Sleeve 72.4
Door 69.2
Flower 67.2
Blossom 67.2
Overcoat 66.8
Coat 66.8
Planter 60.8
Evening Dress 57.8
Fashion 57.8
Gown 57.8
Robe 57.8
Flooring 56.5
Suit 56.3
Flower Arrangement 55.9

Clarifai
created on 2023-10-22

people 99.5
woman 97.3
monochrome 95.2
adult 94.3
one 93.4
indoors 91.4
family 90.7
two 89.8
wear 88.6
man 85.4
wedding 85
window 81.5
portrait 79.6
room 79.1
house 78.8
lid 75.4
dress 75.1
actress 75
administration 74.5
child 74

Imagga
created on 2022-03-05

barbershop 47.1
hairdresser 45.9
shop 40.5
barber chair 36.9
chair 36.8
salon 30.4
mercantile establishment 28.4
people 27.9
seat 24.6
man 24.2
indoors 22.8
male 22
person 19.9
place of business 19
adult 18.8
interior 18.6
home 18.3
men 18
furniture 18
business 17
happy 16.9
window 16.5
women 15.8
portrait 15.5
pretty 14.7
room 13.6
house 13.4
attractive 13.3
businessman 13.2
office 13
lifestyle 12.3
fashion 12.1
two 11.9
happiness 11.7
family 11.6
lady 11.4
couple 11.3
professional 11.1
dress 10.8
working 10.6
cleaner 10.5
looking 10.4
work 10.2
inside 10.1
indoor 10
worker 9.9
modern 9.8
old 9.8
human 9.7
establishment 9.5
corporate 9.4
smiling 9.4
smile 9.3
black 9
cheerful 8.9
lovely 8.9
love 8.7
equipment 8.5
holding 8.3
alone 8.2
one 8.2
style 8.2
domestic 7.8
face 7.8
bride 7.7
casual 7.6
senior 7.5
furnishing 7.4
wedding 7.4
building 7.4
suit 7.2
cute 7.2
hair 7.1
job 7.1
travel 7

Google
created on 2022-03-05

Plant 93.6
Black 89.7
Flowerpot 88.6
Black-and-white 85.1
Hat 79.7
Snapshot 74.3
Houseplant 73.9
Monochrome photography 73.5
Monochrome 71.2
Event 66.9
Stock photography 66.5
Vintage clothing 65.3
Room 65
Art 62.2
Mirror 61.5
Glass 57.7
Rectangle 56.4
History 54.8
Suit 54.4
Door 54.1

Microsoft
created on 2022-03-05

clothing 96.1
dress 93.9
person 88
woman 87.5
black and white 79.6
text 64
footwear 53.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 26-36
Gender Male, 92.3%
Happy 63.5%
Calm 31.3%
Sad 2.8%
Angry 0.6%
Disgusted 0.5%
Surprised 0.5%
Confused 0.4%
Fear 0.3%

AWS Rekognition

Age 45-53
Gender Male, 88%
Sad 97.8%
Confused 1.8%
Calm 0.2%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0.1%
Happy 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Sunglasses
Shoe
Person 98.9%
Person 98.1%
Sunglasses 89.8%
Shoe 82.5%

Categories

Text analysis

Amazon

3
AGO
в
tirn
X004X
2017 EEAA tirn
EEAA
2017
20% EEAA ٤١٢٨
٤١٢٨
20%

Google

KODYH 2.7LEIA tirn KODYK 2.rEEIA tirn B
KODYH
2.7LEIA
tirn
KODYK
2.rEEIA
B