Human Generated Data

Title

Untitled (women sitting on chairs)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19453

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2022-03-05

Chair 99.9
Furniture 99.9
Home Decor 98.3
Person 97.6
Human 97.6
Clothing 88.9
Apparel 88.9
Helmet 88.4
Person 88.1
Person 72.1
Linen 62.6
Living Room 58.2
Indoors 58.2
Room 58.2
Lamp 57
Person 44.6
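
These label scores are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal boto3 sketch, assuming a local file photo.jpg and configured AWS credentials (both placeholders):

```python
import boto3

# Rekognition scores each label on a 0-100 confidence scale
client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:  # placeholder image file
    resp = client.detect_labels(
        Image={"Bytes": f.read()}, MaxLabels=20, MinConfidence=40
    )
for label in resp["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```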

Clarifai
created on 2023-10-22

people 99.8
dancing 97.4
two 96.5
music 96.1
adult 95.7
group together 95.3
actress 94.9
wear 94.9
woman 94.6
dancer 94.5
group 93.3
musician 93
man 92.7
chair 91
recreation 90.2
three 90.1
one 89.5
furniture 88.5
singer 86.5
dress 85.8
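
Clarifai scores concepts on a 0-1 scale (shown above as percentages). A minimal sketch against Clarifai's v2 REST API; the API key, image URL, and general-model id are placeholders/assumptions:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed id of the public general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
# Each concept carries a name and a 0-1 confidence value
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```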

Imagga
created on 2022-03-05

person 24.6
people 20.6
man 20.6
adult 18.6
clothing 15.5
outdoors 14.9
lifestyle 14.4
attractive 14
fun 13.5
male 12.9
happy 12.5
leisure 12.4
fashion 12.1
sitting 12
sport 11.8
day 11.8
lady 11.4
sexy 11.2
casual 11
alone 10.9
women 10.3
smiling 10.1
active 10.1
playing 10
dress 9.9
old 9.7
portrait 9.7
spectator 9.4
legs 9.4
youth 9.4
game 9.3
competition 9.1
teenager 9.1
pretty 9.1
shoes 8.6
play 8.6
model 8.5
black 8.5
child 8.4
senior 8.4
outdoor 8.4
wicker 8.3
city 8.3
leg 8.3
sensual 8.2
fitness 8.1
indoors 7.9
love 7.9
bathing cap 7.8
couple 7.8
happiness 7.8
two 7.6
blond 7.6
joy 7.5
one 7.5
maillot 7.4
skirt 7.4
net 7.4
street 7.4
indoor 7.3
business 7.3
exercise 7.3
looking 7.2
body 7.2
activity 7.2
work 7.1
smile 7.1
interior 7.1
summer 7.1
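
Imagga's tagging endpoint returns a similar tag/confidence list. A minimal sketch, assuming placeholder API credentials and image URL:

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),         # placeholders
)
# Tags come back with a confidence score and a localized tag name
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```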

Microsoft
created on 2022-03-05

furniture 95.4
text 95
table 94.6
chair 91.2
clothing 87.1
footwear 84.1
person 82
black and white 80.6
woman 66.5
house 51.6
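
These tags match the shape of Azure Computer Vision's analyze endpoint with the Tags visual feature. A minimal sketch, assuming a placeholder resource endpoint and subscription key:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder
    json={"url": "https://example.com/photo.jpg"},      # placeholder
)
# Confidences are returned on a 0-1 scale
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```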

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Surprised 93.4%
Calm 4.9%
Happy 0.4%
Fear 0.4%
Confused 0.4%
Sad 0.2%
Angry 0.2%
Disgusted 0.2%
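
The age range, gender, and emotion scores above are the fields Rekognition's DetectFaces operation returns when all attributes are requested. A minimal boto3 sketch (placeholder file and region):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:  # placeholder image file
    resp = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in resp["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned unsorted; sort by confidence, highest first
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```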

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
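
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, and emits one block per detected face, as in the two blocks above. A minimal sketch with the google-cloud-vision client, assuming a placeholder file and configured credentials:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # placeholder image file
    image = vision.Image(content=f.read())

resp = client.face_detection(image=image)
for face in resp.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```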

Feature analysis

Amazon

Person 97.6%
Person 88.1%
Person 72.1%
Person 44.6%
Helmet 88.4%
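
The repeated Person entries are per-instance detections: in a DetectLabels response, countable objects appear under each label's Instances list with their own bounding box and confidence. A minimal sketch (placeholder file):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:  # placeholder image file
    resp = client.detect_labels(Image={"Bytes": f.read()})

for label in resp["Labels"]:
    # Countable objects (Person, Helmet, ...) carry per-instance detections
    for inst in label.get("Instances", []):
        print(f'{label["Name"]} {inst["Confidence"]:.1f}%')
```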

Captions

Microsoft
created on 2022-03-05

a person sitting on a bed 47.6%
a person sitting on a chair 47.5%
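
Ranked captions with confidences come from Azure Computer Vision's describe endpoint. A minimal sketch, with the same placeholder endpoint and key as above:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 2},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder
    json={"url": "https://example.com/photo.jpg"},      # placeholder
)
# Candidate captions are ranked by a 0-1 confidence score
for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```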

Text analysis

Amazon

&
MAMTSA3
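
Detected strings like these come from Rekognition's DetectText operation, which reads text in scene photos. A minimal boto3 sketch (placeholder file):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:  # placeholder image file
    resp = client.detect_text(Image={"Bytes": f.read()})

for det in resp["TextDetections"]:
    if det["Type"] == "LINE":  # skip per-word detections
        print(det["DetectedText"])
```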