Human Generated Data

Title

Untitled (girl)

Date

1968

People

Artist: Barbara Norfleet, American, born 1926

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1946

Copyright

© Barbara Norfleet

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99
Human 99
Furniture 82.2
Clothing 77.6
Apparel 77.6
Electronics 69.9
Table 66.6
Screen 66
Monitor 62.8
Display 62.8
Flooring 62.6
LCD Screen 59
Desk 58.2
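
Label lists like the one above (a name plus a confidence score) are the standard output of Amazon Rekognition's DetectLabels operation. The Python sketch below is only an illustration of how such labels are typically retrieved; the file name and thresholds are assumptions, not the museum's actual pipeline.

import boto3

# Illustrative sketch only: the image path and thresholds are assumptions.
rekognition = boto3.client("rekognition")
with open("untitled_girl_1968.jpg", "rb") as image_file:
    image_bytes = image_file.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # limit the number of labels returned
    MinConfidence=50,    # discard low-confidence labels
)

# Print "Name Confidence" pairs, matching the format of the list above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))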

Clarifai
created on 2023-10-25

people 99.9
monochrome 98.1
adult 97.6
woman 97.2
street 97
girl 96.5
one 96.2
child 96.1
train 95
portrait 94.3
analogue 90.7
man 90.1
room 89.8
music 89.2
locomotive 87.4
transportation system 86.8
black and white 85.9
vehicle 85.9
indoors 85.4
two 84.7

Imagga
created on 2022-01-08

people 20.6
person 19.7
man 18.1
locker 17.9
adult 17.7
business 17.6
light 15.4
indoors 14.9
device 14.8
fastener 14.6
urban 14
modern 13.3
interior 13.3
office 13.2
happy 13.1
home 12.8
women 12.6
work 12.5
lifestyle 12.3
standing 12.2
black 11.9
portrait 11.6
city 11.6
window 11.5
chair 11.4
fashion 11.3
restraint 11.2
attractive 11.2
communication 10.9
male 10.8
table 10.8
umbrella 10.4
smiling 10.1
house 10
laptop 10
smile 10
worker 9.9
professional 9.7
room 9.7
success 9.6
architecture 9.5
corporate 9.4
happiness 9.4
glass 9.3
wall 9.1
silhouette 9.1
businesswoman 9.1
building 9
looking 8.8
sitting 8.6
businesspeople 8.5
holding 8.2
successful 8.2
cheerful 8.1
lady 8.1
brunette 7.8
expression 7.7
door 7.5
outdoors 7.5
inside 7.4
indoor 7.3
sexy 7.2
hair 7.1
working 7.1
businessman 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

black and white 93.3
person 92.7
clothing 90.2
text 84
monochrome 64.2
furniture 53.9
human face 53.5
girl 52.2

Color Analysis

Face analysis

AWS Rekognition

Age 6-14
Gender Female, 99%
Calm 91.2%
Sad 6.8%
Fear 1.1%
Happy 0.2%
Disgusted 0.2%
Angry 0.2%
Surprised 0.2%
Confused 0.1%
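
The age range, gender, and emotion percentages above correspond to the FaceDetails fields returned by Rekognition's DetectFaces operation. A minimal sketch follows, assuming the image is read from a local file; the path is a placeholder.

import boto3

# Minimal sketch; the image path is a placeholder.
rekognition = boto3.client("rekognition")
with open("untitled_girl_1968.jpg", "rb") as image_file:
    image_bytes = image_file.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')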

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
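
The ratings above (Very unlikely through Very likely) are the likelihood values that Google Cloud Vision's face detection reports for facial attributes. A hedged sketch using the google-cloud-vision client; the image path is illustrative, not the museum's actual workflow.

from google.cloud import vision

# Illustrative sketch; the image path is a placeholder.
client = vision.ImageAnnotatorClient()
with open("untitled_girl_1968.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is reported as a likelihood enum rather than a percentage.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)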

Feature analysis

Amazon

Person 99%