Human Generated Data

Title

Untitled (girl and boy next to fence)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17780

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.2
Human 99.2
Person 97.6
Face 92.8
Weather 80.7
Nature 80.7
Kid 80.1
Child 80.1
Urban 79.8
Outdoors 78.7
Clothing 77.3
Apparel 77.3
Girl 75
Female 75
Portrait 72.3
Photography 72.3
Photo 72.3
City 64.6
Building 64.6
Town 64.6
Pants 63.5
Boy 63
People 59.7
Play 57.2
Standing 55.3
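
The Amazon tags above are the kind of label-and-confidence pairs returned by AWS Rekognition's DetectLabels operation. A minimal sketch of such a call is given below, assuming boto3 credentials are already configured and using a hypothetical local file name for the photograph (the page does not expose the actual image path):

```python
import boto3

# Hypothetical file name for the photograph; the real image path is not shown on this page.
IMAGE_PATH = "4.2002.17780.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the list above bottoms out around 55%
    )

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```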

Clarifai
created on 2023-10-29

child 99.7
monochrome 99.4
people 98.7
baby 97.9
love 97.5
son 97
portrait 96.6
family 96.5
girl 94.1
two 93.9
boy 91.7
man 91.2
fun 90.3
sepia 89.3
woman 89.2
cute 88.3
black and white 87.6
little 85.5
affection 85.4
adult 83.3

Imagga
created on 2022-02-26

child 52.1
brother 40.2
family 29.4
parent 29
happy 27.6
people 27.3
person 25.5
mother 24.9
love 24.5
kid 23.9
boy 22.6
male 21.3
portrait 20.7
smile 20.7
happiness 20.4
father 19.8
childhood 19.7
man 19.5
fun 19.5
adult 19.4
together 19.3
smiling 18.8
park 18.2
face 17.8
outdoors 17.2
children 16.4
lifestyle 15.9
little 15.9
cute 15.8
son 15.4
summer 14.8
youth 14.5
leisure 14.1
outdoor 13.8
one 13.4
joy 13.4
dad 13.3
couple 13.1
cheerful 13
playing 12.8
senior 12.2
black 12.1
sibling 11.5
lady 11.4
blond 11.3
play 11.2
pretty 11.2
hair 11.1
grass 11.1
two 11
day 11
relaxation 10.9
looking 10.4
togetherness 10.4
outside 10.3
girls 10
hand 9.9
attractive 9.8
human 9.8
hands 9.6
women 9.5
eyes 9.5
baby 9.1
holding 9.1
world 9
grandma 8.9
sun 8.9
sexy 8.8
daughter 8.8
affectionate 8.7
holiday 8.6
laughing 8.5
sport 8.5
beach 8.4
vacation 8.2
water 8
active 8
embracing 7.8
innocent 7.8
tree 7.7
expression 7.7
playful 7.6
fashion 7.5
kids 7.5
copy space 7.2
body 7.2
recreation 7.2

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98
outdoor 94.7
person 94.5
toddler 92.2
human face 91.5
clothing 90.5
black and white 84.8
boy 80.4
baby 75.7
smile 59.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Female, 99.8%
Calm 79.3%
Sad 11.2%
Surprised 3.3%
Confused 1.8%
Happy 1.4%
Angry 1.3%
Disgusted 0.9%
Fear 0.8%
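
The age, gender, and emotion estimates above match the face attributes that AWS Rekognition's DetectFaces operation returns when all attributes are requested. A sketch under the same assumptions as before (boto3 configured, hypothetical file name):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.17780.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort by confidence to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```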

Feature analysis

Amazon

Person
Person 99.2%

Captions

Microsoft
created on 2022-02-26

a man holding a kite 26.9%

Text analysis

Amazon

a
MJIR
MJIR УТАЛАЗ
УТАЛАЗ
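
The Amazon text fragments above look like the word- and line-level detections returned by AWS Rekognition's DetectText operation, which reports each WORD and the LINE containing it separately (which is why "MJIR" appears both on its own and inside a longer string). A minimal sketch, with the same hypothetical file name as above:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.17780.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Each detection is either a LINE or one of the WORDs inside it.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```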

Google

MJI7 VT 02 002
MJI7
VT
02
002