Human Generated Data

Title

Untitled (woman sitting in chair in living room)

Date

1941

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21906

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-03-11

Furniture 99.7
Chair 99.6
Person 97.8
Human 97.8
Room 90.0
Indoors 90.0
Living Room 80.4
Nature 76.4
Outdoors 73.7
Couch 73.2
Interior Design 72.0
Ice 68.2
Bedroom 55.8
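
The Amazon tags above match the output of Amazon Rekognition's label detection. A minimal sketch of requesting such tags with boto3 follows; the file name is a hypothetical stand-in for the digitized photograph, and configured AWS credentials are assumed.

```python
import boto3

# Minimal sketch, assuming configured AWS credentials; the file name is
# a hypothetical stand-in for the digitized photograph.
client = boto3.client("rekognition")

with open("untitled_woman_sitting.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Print name/confidence pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```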

Clarifai
created on 2023-10-22

people 100.0
two 98.3
child 98.1
group 97.3
monochrome 97.0
adult 96.8
woman 96.2
street 95.6
group together 95.3
furniture 95.2
three 94.1
one 92.9
home 92.1
man 91.8
several 90.6
sit 90.2
four 89.7
family 89.0
many 88.7
chair 87.9
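
Clarifai concepts like those above come from its general image-recognition model. A hedged sketch against Clarifai's v2 REST API follows; the API key is a placeholder, and the model ID ("general-image-recognition") should be verified against current Clarifai documentation.

```python
import base64
import requests

# Hedged sketch of Clarifai's v2 REST API; the API key is a placeholder,
# and the model ID should be checked against current Clarifai docs.
with open("untitled_woman_sitting.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)

# Concept values are 0-1; scale to match the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```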

Imagga
created on 2022-03-11

barbershop 74.7
shop 59.6
mercantile establishment 45.0
chair 37.5
seat 30.7
place of business 30.3
architecture 29.8
city 25.8
building 25.7
street 23.9
old 23.7
house 23.4
window 21.6
barber chair 21.2
stone 19.1
furniture 19.0
travel 18.3
tourism 17.3
home 16.7
town 15.8
establishment 15.2
ancient 13.8
interior 13.3
urban 13.1
culture 12.8
room 12.3
wall 12.0
lamp 11.4
antique 11.3
classic 11.1
tourist 10.9
decoration 10.9
history 10.7
light 10.7
scene 10.4
brick 10.4
exterior 10.1
historic 10.1
road 9.9
religion 9.9
night 9.8
bench 9.5
church 9.2
tree 9.2
arch 8.7
palace 8.7
windows 8.6
balcony 8.6
column 8.4
design 8.4
place 8.4
people 8.4
color 8.3
traditional 8.3
vintage 8.3
landmark 8.1
indoors 7.9
structure 7.9
door 7.8
houses 7.7
luxury 7.7
living 7.6
buildings 7.6
historical 7.5
black 7.2
celebration 7.2
holiday 7.2
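
Imagga's tagging endpoint returns tag/confidence pairs like the list above. A sketch follows; the key/secret pair and the image URL are placeholders, and Imagga authenticates with HTTP Basic auth.

```python
import requests

# Sketch of Imagga's tagging endpoint; the key/secret pair and the image
# URL are placeholders. Imagga uses HTTP Basic auth.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/untitled_woman_sitting.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```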

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 90.3
black and white 82.7
furniture 79.5
house 75.4
white 63.1
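
The Microsoft tags are consistent with the Azure Computer Vision tag endpoint. A sketch against the v3.2 REST API follows; the resource endpoint and subscription key are placeholders.

```python
import requests

# Sketch against the Azure Computer Vision v3.2 tag endpoint; the
# resource endpoint and subscription key are placeholders.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"

with open("untitled_woman_sitting.jpg", "rb") as f:
    response = requests.post(
        f"{endpoint}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": "YOUR_KEY",
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

# Confidences are 0-1; scale to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```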

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 55.7%
Calm 62.9%
Sad 24.0%
Fear 7.9%
Disgusted 1.3%
Surprised 1.2%
Confused 1.1%
Happy 0.9%
Angry 0.7%
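
The age, gender, and emotion estimates above follow the shape of Amazon Rekognition's DetectFaces response. A sketch with boto3 follows, under the same assumptions as the label-detection example (hypothetical path, configured credentials).

```python
import boto3

# Sketch of Rekognition face analysis under the same assumptions as the
# label-detection example (hypothetical path, configured credentials).
client = boto3.client("rekognition")

with open("untitled_woman_sitting.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; order by confidence as in the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```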

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
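
Google's likelihood buckets (Very unlikely through Very likely) come from the Cloud Vision face-detection annotation. A sketch with the google-cloud-vision client follows; it assumes the v2+ client surface, configured Google Cloud credentials, and a hypothetical file path.

```python
from google.cloud import vision

# Sketch with the google-cloud-vision client (v2+ surface assumed);
# Google Cloud credentials must be configured, the path is hypothetical.
client = vision.ImageAnnotatorClient()

with open("untitled_woman_sitting.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```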

Feature analysis

Amazon

Person
Person 97.8%
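
The Person score repeats the Rekognition label above; DetectLabels also returns per-instance bounding boxes, which a sketch like this could extract (same assumptions as before).

```python
import boto3

# DetectLabels also returns per-instance bounding boxes; this sketch pulls
# the Person instances out of the same response used for the tags above.
client = boto3.client("rekognition")

with open("untitled_woman_sitting.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            print(f"Person {instance['Confidence']:.1f}%", instance["BoundingBox"])
```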

Categories

Captions

Microsoft
created on 2022-03-11

an old photo of a person 28.7%
a person standing in a room 28.6%
a group of people in a room 28.5%
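
These captions match Azure Computer Vision's describe endpoint, which returns ranked caption candidates with confidences. A sketch follows; the endpoint and key are placeholders.

```python
import requests

# Sketch of Azure Computer Vision's describe endpoint, which returns
# ranked caption candidates; endpoint and key are placeholders.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"

with open("untitled_woman_sitting.jpg", "rb") as f:
    response = requests.post(
        f"{endpoint}/vision/v3.2/describe",
        params={"maxCandidates": "3"},
        headers={
            "Ocp-Apim-Subscription-Key": "YOUR_KEY",
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```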

Text analysis

Amazon

2
MJ13
MJ13 АЗДА
АЗДА
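
The detected strings above, including the Cyrillic misreading, are typical Rekognition DetectText output on film edge markings. A sketch under the same assumptions as the earlier boto3 examples:

```python
import boto3

# Sketch of Rekognition text detection (hypothetical path, configured
# credentials); LINE entries reproduce strings like those above.
client = boto3.client("rekognition")

with open("untitled_woman_sitting.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```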

Google

TCEV 2VLEIA EITH
TCEV
2VLEIA
EITH
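
The Google strings follow Cloud Vision text detection, whose first annotation is the full detected text. A sketch under the same assumptions as the face-detection example:

```python
from google.cloud import vision

# Sketch of Cloud Vision text detection; the first annotation is the
# full detected text, later ones are individual words. Same assumptions
# as the face-detection sketch.
client = vision.ImageAnnotatorClient()

with open("untitled_woman_sitting.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
if response.text_annotations:
    print(response.text_annotations[0].description)
```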