Human Generated Data

Title

Untitled (woman on chair in living room)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19452

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Chair 99.8
Furniture 99.8
Person 98
Human 98
Living Room 95.7
Room 95.7
Indoors 95.7
Home Decor 90.2
Clothing 72.2
Apparel 72.2
Person 68.1
Linen 68
Flooring 67.5
Rug 65.6
Armchair 59.5
Monitor 56.7
Electronics 56.7
Screen 56.7
Display 56.7
Floor 55.8
Couch 55.4
Dressing Room 55.1
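
The Amazon tags above pair a label with a confidence score in percent. A minimal sketch of how such a list could be produced, assuming the source is Amazon Rekognition's detect_labels API called through boto3; the image path and the 55% confidence floor are illustrative assumptions, not part of this record:

```python
import boto3

# Region and credentials come from the local AWS configuration.
rekognition = boto3.client("rekognition")

# Illustrative image path -- the record does not expose the source file.
with open("photo.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=55,  # assumed floor; the lowest score listed above is ~55
    )

# Print label name and confidence, mirroring the "Chair 99.8" style used above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```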

Clarifai
created on 2023-10-22

people 99.9
chair 98.1
two 97.8
man 97.5
adult 96.9
music 96.1
group 95.4
musician 94.3
seat 93.1
group together 92.4
three 92.3
furniture 92.3
woman 91.8
leader 89
actress 88.4
sit 88.4
several 85.7
one 84.7
actor 83.7
wear 83.2

Imagga
created on 2022-03-05

wicker 60.5
work 39.1
product 35.7
net 26.8
creation 25
tennis 19.5
man 18.1
court 17.5
active 16.2
adult 15.8
people 15.6
person 14.9
hat 14.2
lifestyle 13.7
fitness 13.5
rattan 13.5
fun 13.5
sport 13.2
happy 13.1
outdoor 13
summer 12.9
outside 12.8
male 12.8
leisure 12.4
outdoors 11.9
portrait 11
racket 11
happiness 11
switch 10.7
clothing 10.6
healthy 10.1
competition 10.1
playing 10
exercise 10
dress 9.9
fashion 9.8
pretty 9.8
attractive 9.8
old 9.7
couple 9.6
athlete 9.5
youth 9.4
casual 9.3
smile 9.3
joy 9.2
hand 9.1
child 8.8
women 8.7
face 8.5
shopping 8.3
park 8.2
cheerful 8.1
lady 8.1
instrument of punishment 8.1
activity 8.1
game 8
love 7.9
health 7.6
player 7.5
senior 7.5
one 7.5
place 7.4
sports 7.4
fit 7.4
wedding 7.4
teenager 7.3
girls 7.3
smiling 7.2
ball 7.2
cute 7.2
recreation 7.2
hair 7.1
day 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

furniture 92
chair 91.8
text 89.6
black and white 86.4
musical instrument 79.2
clothing 75.8
person 75.3
table 64.1
footwear 61.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 35-43
Gender Female, 87.3%
Calm 81.9%
Happy 14.4%
Sad 1.9%
Confused 0.6%
Disgusted 0.4%
Surprised 0.4%
Angry 0.2%
Fear 0.2%
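
The age range, gender, and per-emotion confidences above match the shape of Amazon Rekognition's face analysis output. A minimal sketch, assuming detect_faces with full attributes via boto3; the image path is an illustrative assumption, not part of this record:

```python
import boto3

rekognition = boto3.client("rekognition")

# Illustrative input; the record does not expose the underlying image file.
with open("photo.jpg", "rb") as image_file:
    image_bytes = image_file.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back with per-emotion confidences, as listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```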

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
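
The Google Vision rows above report likelihood buckets (Very unlikely, Unlikely, and so on) rather than percentages. A minimal sketch of reading the same fields with the google-cloud-vision Python client; the image path is an illustrative assumption, not part of this record:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Illustrative input path; the record does not expose the source image.
with open("photo.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```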

Feature analysis

Amazon

Person
Rug
Person 98%
Person 68.1%
Rug 65.6%

Categories

Captions

Text analysis

Amazon

28
NAMTSA3
KAGOX NAMTSA3
KAGOX
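
The strings above are text fragments the service read out of the photograph itself. A minimal sketch of how such detections could be produced, assuming they came from Amazon Rekognition's detect_text call via boto3; the image path is an illustrative assumption, not part of this record:

```python
import boto3

rekognition = boto3.client("rekognition")

# Illustrative input; the record does not expose the underlying image file.
with open("photo.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# Each detection carries the recognized string, its type (LINE or WORD),
# and a confidence score; only the strings appear in the record above.
for detection in response["TextDetections"]:
    print(detection["DetectedText"], detection["Type"], round(detection["Confidence"], 1))
```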