Human Generated Data

Title

Untitled (photograph of Gittings photo: "Aphrodite")

Date

c. 1970

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13391

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.5
Apparel 99.5
Person 95.1
Human 95.1
Head 93.8
Bonnet 81.2
Hat 81.2
Art 77.8
Face 71
Photography 65.5
Photo 65.5
Portrait 64.9
Monitor 64.6
Electronics 64.6
Screen 64.6
Display 64.6
Advertisement 61.8
Collage 59.2
Poster 59.2
Home Decor 56.1
Painting 55.5
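
The "Amazon" tags above pair each label with a confidence score. A minimal sketch, assuming they were produced with Amazon Rekognition's DetectLabels operation via boto3; the local file name and the confidence threshold are illustrative assumptions, not part of this record:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("aphrodite_photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,  # the lowest score in the listing above is 55.5
)

# Print "Label confidence" pairs in the same form as the listing above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")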

Clarifai
created on 2023-10-27

portrait 98.8
people 98.5
painting 98.1
woman 97.3
picture frame 97.3
sepia 97.2
one 96
baby 95.6
museum 95.5
family 95.5
child 95.2
art 94.6
adult 93.1
girl 92.5
indoors 92.1
retro 90.4
two 90
moment 89.8
love 89.6
nostalgia 89.2

Imagga
created on 2022-01-23

television 100
telecommunication system 100
monitor 49.4
screen 48.2
display 41.8
technology 37.1
computer 34.1
electronic 28
equipment 23.3
flat 22.2
laptop 21.1
modern 21
business 20.6
plasma 16.5
broadcasting 15.6
digital 15.4
frame 15.4
design 15.2
communication 15.1
desktop 14.4
object 13.9
black 13.8
entertainment 13.8
office 13.7
person 13.6
wide 13.4
people 12.8
space 12.4
silver 12.4
media 12.4
network 12
blank 12
liquid crystal display 11.9
work 11.8
telecommunication 11.7
video 11.6
notebook 11.6
panel 11.6
tech 11.4
keyboard 11.3
smiling 10.8
happy 10.7
information 10.6
home 10.4
crystal 10.4
symbol 10.1
global 10
liquid crystal 9.9
visual 9.6
electronics 9.5
mobile 9.4
3d 9.3
smile 9.3
face 9.2
one 9
portable 8.7
women 8.7
wireless 8.6
pretty 8.4
presentation 8.4
studio 8.4
style 8.2
gray 8.1
happiness 7.8
liquid 7.8
medium 7.8
movie 7.7
set 7.6
web 7.6
showing 7.5
inside 7.4
data 7.3
portrait 7.1
world 7.1
copy 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

monitor 99.4
human face 98.3
text 95.4
person 94.7
indoor 94
woman 90
screen 88.4
clothing 87.3
painting 74.7
black 73.2
portrait 69.6
white 67.1
gallery 61.5
smile 51.3
old 49.6
image 35.4
picture frame 25.7

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 100%
Calm 96%
Happy 1.9%
Confused 0.6%
Surprised 0.5%
Sad 0.4%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%
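
A minimal sketch of how the age range, gender, and emotion estimates above could be obtained from Rekognition's DetectFaces operation with full attributes; the file name is a hypothetical placeholder, as in the DetectLabels sketch above:

import boto3

rekognition = boto3.client("rekognition")

with open("aphrodite_photograph.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    # Emotions come back unsorted; sort by confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")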

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.1%
Monitor 64.6%

Categories

Captions

Text analysis

Amazon

GITINGS
PAUL LISWOOD GITINGS
PAUL
APHRODITE
LISWOOD
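
The strings above read like line-level OCR output. A minimal sketch, assuming they came from Rekognition's DetectText operation, reusing the same hypothetical image file as the earlier sketches:

import boto3

rekognition = boto3.client("rekognition")

with open("aphrodite_photograph.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections roughly correspond to the strings listed above;
# WORD detections repeat the same text token by token.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])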

Google

PIRODITE
PIRODITE