Human Generated Data

Title

Untitled (girl with dolls standing next to lawn chair in yard)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12965

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.4
Human 99.4
Clothing 89.8
Apparel 89.8
Food 87.6
Meal 87.6
Plant 83.7
Flower 64.4
Blossom 64.4
Tree 60.3
Face 59.7
Leisure Activities 59.7
Vacation 59.7
Picnic 59.7
Trash 59.6
Furniture 59
Overcoat 58.8
Coat 58.8
Suit 58.8
Flower Arrangement 55.8
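
The label/confidence pairs above match the output shape of Amazon Rekognition's label-detection API. A minimal sketch of such a call via boto3, assuming a placeholder S3 bucket and key and an illustrative confidence threshold (not the record's actual pipeline):

```python
# Hedged sketch: how label/confidence pairs like those above could be
# produced with Rekognition. Bucket, key, and threshold are placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    MinConfidence=55,  # assumed cutoff; the list above bottoms out near 55
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```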

Clarifai
created on 2019-11-16

people 99.9
group 98.8
adult 98.7
two 97.4
group together 97.4
one 96.7
many 95.9
vehicle 95.7
child 94.6
man 93.9
woman 92.7
recreation 90.8
three 90.4
wear 89.3
veil 88
four 86.7
home 86.3
several 85.7
canine 85.1
leader 84.9
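
The Clarifai tags likewise resemble output from Clarifai's v2 predict endpoint against its general model. A hedged sketch, with the API key, model ID, and image URL as placeholder assumptions:

```python
# Hedged sketch of a Clarifai v2 request that returns concept/confidence
# pairs like those above. All credentials and IDs are placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                       # placeholder
MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"           # assumed general-model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports confidence as a 0-1 "value"; the tags above look
    # like that value scaled to a percentage.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```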

Imagga
created on 2019-11-16

tricycle 47.8
wheeled vehicle 45.4
park 32.1
vehicle 31.4
snow 30.3
tree 27.3
bench 25.5
winter 24.7
cold 24.1
conveyance 22.6
outdoors 21.6
landscape 20.1
forest 20
trees 19.6
garden 16.4
outdoor 15.3
park bench 15
season 14.8
snowy 13.6
man 13.4
wood 13.3
seat 12.9
people 12.8
river 12.5
water 12
autumn 11.4
weather 11.4
path 11.3
summer 10.9
scenic 10.5
rubbish 10.3
male 10.2
ice 10.1
light 10
fall 10
child 9.8
person 9.8
old 9.8
rural 9.7
frost 9.6
woods 9.6
peaceful 9.2
travel 9.2
teenager 9.1
recreation 9
country 8.8
serenity 8.7
flowers 8.7
sitting 8.6
pretty 8.4
countryside 8.2
natural 8
family 8
grass 7.9
love 7.9
bucket 7.9
leaves 7.9
boy 7.8
black 7.8
frozen 7.6
walk 7.6
handcart 7.6
house 7.5
happy 7.5
stone 7.5
fun 7.5
leisure 7.5
plants 7.4
barrow 7.4
tourist 7.2
scenery 7.2
adult 7.1
portrait 7.1
mountain 7.1
spring 7.1
day 7.1
building 7.1
together 7
seasonal 7
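
The Imagga list has the shape of Imagga's v2 tagging endpoint, which reports confidences on a 0-100 scale. A sketch under that assumption, with placeholder credentials and image URL:

```python
# Hedged sketch of an Imagga v2 tagging request of the kind that could
# yield the tag/confidence list above. Credentials are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),  # HTTP basic auth
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Imagga already reports confidence on a 0-100 scale.
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```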

Google
created on 2019-11-16 (no tags recorded)

Microsoft
created on 2019-11-16

tree 100
person 78.3
black and white 77.4
clothing 76.6
plant 71.7
park 70.2
text 65.7
white 60.5
grave 53.3
surrounded 16.6
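
The Microsoft tags look like Azure Computer Vision "analyze" output with 0-1 confidences scaled to percentages. A sketch assuming a placeholder endpoint, subscription key, and API version:

```python
# Hedged sketch of an Azure Computer Vision tag request. The endpoint
# region, key, and API version are placeholder assumptions.
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder region
KEY = "YOUR_AZURE_CV_KEY"                                # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```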

Face analysis

Amazon

AWS Rekognition

Age 4-12
Gender Female, 54.9%
Sad 45%
Disgusted 45%
Confused 45%
Surprised 45%
Happy 54.9%
Fear 45%
Angry 45%
Calm 45%

AWS Rekognition

Age 1-7
Gender Female, 50.3%
Angry 49.5%
Disgusted 49.5%
Fear 50.5%
Sad 49.5%
Happy 49.5%
Calm 49.5%
Confused 49.5%
Surprised 49.5%
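
The face fields above (an age range, gender with confidence, and one confidence per emotion) match Amazon Rekognition's face-detection output with all attributes requested. A sketch with a placeholder S3 location:

```python
# Hedged sketch of the Rekognition face-analysis call whose output shape
# matches the fields above. The S3 location is a placeholder.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```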

Feature analysis

Amazon

Person 99.4%

Text analysis

Google

CHIM
CHIM
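
The repeated token is consistent with Google Cloud Vision OCR, whose first annotation is the full detected text and whose subsequent annotations are the individual words. A sketch with a placeholder image URI:

```python
# Hedged sketch of a Google Cloud Vision text-detection call of the kind
# that could return the "CHIM" detections above. Image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/photo.jpg"))

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual
# words, which is why the same token can appear more than once.
for annotation in response.text_annotations:
    print(annotation.description)
```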