Human Generated Data

Title

Untitled (women looking into mirror)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19256

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 98.9
Human 98.9
Person 98.5
Person 95.7
Person 94.2
Drawing 78.9
Art 78.9
Cat 70.3
Animal 70.3
Mammal 70.3
Pet 70.3
Interior Design 57.5
Indoors 57.5
Plot 57.4
Poster 55.6
Advertisement 55.6
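
These label/confidence pairs have the shape of Amazon Rekognition's DetectLabels output. A minimal boto3 sketch of such a call follows; the file path, MaxLabels, and MinConfidence values are illustrative assumptions, not settings recorded here:

# Sketch: fetching image labels with Amazon Rekognition via boto3.
# Assumes configured AWS credentials; the file path and thresholds
# below are illustrative, not taken from this record.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50.0,
)

# Each label carries a name and a 0-100 confidence score,
# matching entries such as "Person 98.9" above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")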

Clarifai
created on 2023-10-22

people 99.8
adult 99.2
woman 98.7
one 98.2
two 98
portrait 97.9
wear 97.6
man 96.9
street 96.6
painting 93.4
art 93.3
facial expression 93.3
girl 92.8
music 92.6
family 91
furniture 88.8
retro 88.2
indoors 87.4
group 87.4
veil 85.7
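
A sketch of how concepts like these could be requested from Clarifai's v2 REST API; the API key, model ID, and image URL are placeholders, and account-specific routing details may differ:

# Sketch: requesting image concepts from the Clarifai v2 REST API.
# The API key, model ID, and image URL below are placeholder assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed: Clarifai's general model
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts arrive with a name and a 0-1 confidence value; the listing
# above shows the same values scaled to 0-100 (e.g. "people 99.8").
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")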

Imagga
created on 2022-02-25

rule 52.1
measuring stick 40.3
measuring instrument 32.8
instrument 30.6
paper 29
negative 19.8
film 19.7
tool 17.9
retro 15.6
old 15.3
drawing 14.9
grunge 14.5
business 14
pencil 13.8
blank 13.7
design 13.2
vintage 13.2
aged 12.7
work 12.6
book 11.6
pen 11.6
photographic paper 11.5
plan 11.3
ancient 11.2
page 11.1
frame 11.1
note 11
letter 11
device 10.8
ruler 10.8
antique 10.4
empty 10.3
money 10.2
sketch 10.2
architecture 10.2
finance 10.1
symbol 10.1
message 10
tape 9.9
yellow 9.9
measurement 9.6
project 9.6
measure 9.6
poster 9.4
construction 9.4
document 9.3
hand 9.2
scale 9.1
texture 9
envelope 9
financial 8.9
office 8.8
book jacket 8.8
home 8.8
tools 8.5
space 8.5
sheet 8.5
number 8.4
house 8.4
technology 8.2
paint 8.1
brown 8.1
school 8.1
wealth 8.1
bank 8.1
science 8
equipment 8
art 7.8
photographic equipment 7.8
wall 7.7
dollar 7.4
investment 7.3
border 7.2
black 7.2
currency 7.2
building 7.1
notebook 7.1
information 7.1
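
A sketch of an equivalent request against Imagga's v2 tagging endpoint; the key/secret pair and image URL are placeholder assumptions:

# Sketch: tagging an image with the Imagga v2 REST API.
# The key/secret pair and image URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key and secret
)
response.raise_for_status()

# Tags are returned with English names and 0-100 confidence scores,
# matching entries such as "rule 52.1" above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")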

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 98.8
person 91.5
drawing 90.3
human face 90.1
clothing 76
sketch 71.4
woman 61.4
picture frame 7.1
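
A sketch of the corresponding Azure Computer Vision Analyze call (v3.2, Tags feature); the endpoint, subscription key, and image URL are placeholders:

# Sketch: image tagging with the Azure Computer Vision Analyze API (v3.2).
# Endpoint, subscription key, and image URL are placeholder assumptions.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Tags carry a 0-1 confidence; the listing above shows the same
# values scaled to 0-100 (e.g. "text 98.8").
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")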

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 100%
Happy 98.5%
Surprised 0.5%
Fear 0.4%
Angry 0.3%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%
Calm 0.1%

AWS Rekognition

Age 33-41
Gender Female, 100%
Calm 94.5%
Sad 2.2%
Angry 0.9%
Confused 0.9%
Happy 0.6%
Surprised 0.4%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 16-22
Gender Male, 60%
Calm 96.5%
Sad 2%
Fear 0.4%
Angry 0.3%
Confused 0.3%
Disgusted 0.2%
Surprised 0.2%
Happy 0.1%
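
The three blocks above match the shape of Amazon Rekognition's DetectFaces response, one block per detected face. A minimal boto3 sketch follows; the local file path is an assumption:

# Sketch: face attribute estimation with Amazon Rekognition's DetectFaces.
# Assumes configured AWS credentials; the file path is illustrative.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

# Each detected face reports an age range, a gender guess with
# confidence, and per-emotion confidences, as in the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")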

Microsoft Cognitive Services

Age 36
Gender Female
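
This age/gender estimate matches the historical Azure Face API v1.0 detect call sketched below; Microsoft has since retired these attributes, and the endpoint, key, and image URL are placeholders:

# Sketch: the (historical) Azure Face API v1.0 detect call that could
# produce an age/gender estimate. These attributes have since been
# retired by Microsoft; all credentials and URLs are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Each face reports a numeric age estimate and a gender string,
# e.g. "Age 36 / Gender Female" as listed above.
for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")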

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
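
These ratings follow the Google Cloud Vision face-detection response, which reports enum likelihoods (VERY_UNLIKELY through VERY_LIKELY) that the listing above renders as "Very unlikely", "Unlikely", and so on. A minimal client-library sketch; the file path is an assumption:

# Sketch: face detection with the Google Cloud Vision client library.
# Assumes application default credentials; the file path is illustrative.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face, each with per-attribute likelihoods.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)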

Feature analysis

Amazon

Person 98.9%
Person 98.5%
Person 95.7%
Person 94.2%
Cat 70.3%

Categories

Text analysis

Amazon

188
133
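
Detections like these could come from Amazon Rekognition's DetectText API. A minimal boto3 sketch; the file path is an assumption:

# Sketch: text detection with Amazon Rekognition's DetectText.
# Assumes configured AWS credentials; the file path is illustrative.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Detections come back at LINE and WORD granularity; the entries
# above ("188", "133") would be the DetectedText fields.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])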

Google

133 ...... ...... .... ...*..