Human Generated Data

Title

Untitled (girl sitting in small chair)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17662

Machine Generated Data

Tags (each label is followed by the service's confidence score, in percent)

Amazon
created on 2022-02-26

Furniture 99.7
Chair 99.6
Shoe 99.4
Footwear 99.4
Clothing 99.4
Apparel 99.4
Person 96.1
Human 96.1
Shoe 84
Floor 79.9
Blonde 76.1
Teen 76.1
Kid 76.1
Woman 76.1
Child 76.1
Girl 76.1
Female 76.1
Portrait 62.5
Face 62.5
Photography 62.5
Photo 62.5
Flooring 61.4
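
These label/confidence pairs match the output of AWS Rekognition's DetectLabels API. A minimal sketch using boto3; the filename, region, and MinConfidence threshold are illustrative assumptions, not part of this record:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the scan
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=60,  # nothing in this record falls below ~61%
)

for label in response["Labels"]:
    # Prints label/confidence pairs like "Furniture 99.7" above.
    print(f"{label['Name']} {label['Confidence']:.1f}")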

Clarifai
created on 2023-10-28

people 99.8
portrait 99.1
child 97.7
one 96.4
man 95.3
monochrome 94.7
adult 93.4
chair 92.5
boy 91.8
art 90.9
wear 89.6
girl 88.1
actor 87.9
model 86.6
self 86.5
street 84.4
room 84.2
son 82.7
analogue 81.6
shadow 79.7
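
Clarifai's concepts come from its predict endpoint. The sketch below follows the v2 REST API as commonly documented; the model ID, auth header format, and payload shape are assumptions to verify against current Clarifai documentation. Note that Clarifai reports concept values on a 0-1 scale, shown above as percentages.

import requests

CLARIFAI_KEY = "YOUR_API_KEY"  # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed general-model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # e.g. "people 99.8", matching the list above (value scaled to %).
    print(f"{concept['name']} {concept['value'] * 100:.1f}")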

Imagga
created on 2022-02-26

crutch 82.7
staff 64
stick 51.2
man 20.8
adult 20.2
cleaner 18.4
person 18.3
black 18.1
people 17.8
chair 16.5
male 16.3
men 15.4
women 13.4
one 13.4
city 13.3
room 13
fashion 12.8
indoors 12.3
equipment 12
wall 12
interior 11.5
human 11.2
street 11
business 10.9
urban 10.5
building 10.3
elegance 10.1
active 9.9
attractive 9.8
portrait 9.7
style 9.6
sexy 9.6
home 9.6
lifestyle 9.4
pretty 9.1
old 9.1
dress 9
furniture 8.8
happy 8.8
hair 8.7
clothing 8.6
smile 8.5
hand 8.4
alone 8.2
exercise 8.2
seat 8
worker 8
looking 8
body 8
working 8
work 7.9
standing 7.8
cleaning 7.8
model 7.8
sitting 7.7
health 7.6
walking 7.6
house 7.5
floor 7.4
window 7.4
sensuality 7.3
smiling 7.2
support 7.2
suit 7.2
device 7.2
helmet 7.1
posing 7.1
job 7.1
businessman 7.1
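
Imagga exposes tagging over a REST API with HTTP basic auth. A hedged sketch using requests; the endpoint path and response shape follow Imagga's v2 documentation as recalled and should be verified. Imagga's separate /v2/categories endpoint (with a categorizer such as personal_photos) is the likely source of the "Categories" section further below.

import requests

IMAGGA_KEY, IMAGGA_SECRET = "key", "secret"  # placeholder credentials

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)

for item in response.json()["result"]["tags"]:
    # e.g. "crutch 82.7", matching the list above.
    print(f"{item['tag']['en']} {item['confidence']:.1f}")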

Microsoft
created on 2022-02-26

floor 97.5
text 94.2
holding 90.2
black and white 90
clothing 89.9
footwear 79.4
person 78.4
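
Microsoft's tags have the shape returned by the Azure Computer Vision analyze endpoint. A sketch against the v3.2 REST API; the endpoint host, key, and API version are assumptions, and Azure reports confidence on a 0-1 scale (shown above as percentages).

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder credential

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY,
             "Content-Type": "application/octet-stream"},
    data=image_bytes,
)

for tag in response.json()["tags"]:
    # e.g. "floor 97.5" (confidence scaled to %).
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")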

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 99.4%
Calm 57.2%
Surprised 36.6%
Sad 2%
Fear 1.1%
Disgusted 1%
Angry 0.9%
Confused 0.9%
Happy 0.3%
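
The age range, gender, and emotion scores above are what Rekognition's DetectFaces returns when all attributes are requested. A minimal boto3 sketch (filename and region are assumptions); note the machine attributes can disagree with the human cataloging, as with the "Male, 99.4%" reading here against the title's "girl":

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("photo.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]  # e.g. {"Low": 23, "High": 33}
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # CALM, SURPRISED, SAD, ...
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")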

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
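
Google's likelihood buckets come from the Cloud Vision face detection API. A sketch using recent versions of the google-cloud-vision Python client; the filename and credential setup are assumptions:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)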

Feature analysis

Amazon

Shoe 99.4%
Shoe 84%
Person 96.1%
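
The repeated labels here (two Shoe detections at different confidences) correspond to per-instance bounding boxes, which DetectLabels returns alongside each image-level label in its Instances field. A self-contained boto3 sketch, with the same assumed filename and region as above:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("photo.jpg", "rb") as f:  # hypothetical local copy
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # Left/Top/Width/Height as image ratios
        # e.g. "Shoe 99.4%" and "Shoe 84.0%" as separate detections above.
        print(f"{label['Name']} {instance['Confidence']:.1f}%", box)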

Categories

Imagga

interior objects 97.2%
paintings art 2.2%

Text analysis

Amazon

a
YT33A2
MLN YT33A2 ОСНИ
MLN
ОСНИ
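
These fragments are the kind of output Rekognition's DetectText returns. A minimal boto3 sketch (setup assumptions as before):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("photo.jpg", "rb") as f:  # hypothetical local copy
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the individual tokens
        print(detection["DetectedText"])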

Google

MUNA YT3 A
MUNA
YT3
A
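
Google's fragments map to Cloud Vision's text_detection, where the first annotation is the full detected block and the rest are individual tokens, which explains the "MUNA YT3 A" line followed by its parts. A sketch with the same client-setup assumptions as the face-detection example:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    # First entry is the full block ("MUNA YT3 A"); the rest are tokens.
    print(annotation.description)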