Human Generated Data

Title

Untitled (four children at picnic table, Mills children, NH)

Date

August 1956

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18088

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.2
Human 99.2
Person 98.7
Person 97.7
Clothing 91.6
Apparel 91.6
Person 91.2
Furniture 81.2
Outdoors 65.4
Sunglasses 62.7
Accessories 62.7
Accessory 62.7
Vehicle 60.4
Transportation 60.4
People 60.3
Sitting 59.8
Suit 59.5
Overcoat 59.5
Coat 59.5
Silhouette 59.4
Nature 58
Piano 56.4
Musical Instrument 56.4
Leisure Activities 56.4
Hat 56.2
Bench 55.8

Clarifai
created on 2023-10-29

people 99.8
adult 99
group together 98.2
group 98.2
woman 98.1
vehicle 97.2
man 96.2
recreation 95.4
child 95.1
transportation system 93.9
wear 92.4
furniture 91.6
many 91.3
five 91.3
leader 89.4
bench 89.4
sitting 89.2
four 89.1
two 88
several 87.8

Imagga
created on 2022-03-04

bench 49.4
park bench 46.4
man 32.2
seat 30.7
laptop 26.8
people 26.2
adult 22
male 22
outdoors 21.7
sitting 21.5
person 20.1
happy 20
furniture 19.5
cheerful 18.7
work 18.3
smile 17.8
lifestyle 17.3
business 17
computer 16.6
musical instrument 16.4
couple 15.7
outdoor 15.3
together 14.9
technology 14.8
attractive 14.7
women 14.2
keyboard instrument 13.4
park 13.4
notebook 13.3
accordion 13.3
men 12.9
love 12.6
smiling 12.3
fun 12
businesswoman 11.8
happiness 11.7
group 11.3
friends 11.3
pretty 11.2
corporate 11.2
two 11
portrait 11
communication 10.9
leisure 10.8
cute 10.8
office 10.7
family 10.7
working 10.6
looking 10.4
youth 10.2
day 10.2
child 10.2
casual 10.2
worker 9.8
businessman 9.7
black 9.6
furnishing 9.6
sit 9.5
professional 9.4
student 9.1
handsome 8.9
spectator 8.9
success 8.8
full length 8.7
education 8.7
table 8.6
togetherness 8.5
friendship 8.4
modern 8.4
wind instrument 8.4
joy 8.3
television 8.3
executive 8.3
teen 8.3
one 8.2
teenager 8.2
lady 8.1
home 8
summer 7.7
jeans 7.6
hand 7.6
relax 7.6
human 7.5
car 7.5
chair 7.3
transportation 7.2
team 7.2
vehicle 7.1
autumn 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

outdoor 99.2
black and white 92.2
person 91.8
street 87.9
text 83.9
clothing 83.3
man 58.9
monochrome 58.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 41-49
Gender Female, 59.3%
Happy 92.3%
Calm 6.1%
Surprised 0.4%
Sad 0.4%
Fear 0.3%
Disgusted 0.2%
Confused 0.2%
Angry 0.1%

AWS Rekognition

Age 37-45
Gender Male, 89.6%
Calm 99.4%
Sad 0.3%
Fear 0.2%
Disgusted 0%
Confused 0%
Happy 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 9-17
Gender Male, 65.3%
Happy 58.4%
Sad 28.3%
Calm 6.7%
Confused 3.1%
Disgusted 1.1%
Fear 1%
Surprised 0.7%
Angry 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Sunglasses
Piano
Person 99.2%
Person 98.7%
Person 97.7%
Person 91.2%
Sunglasses 62.7%
Piano 56.4%

Categories

Text analysis

Amazon

MJ17--YT3RA
KAGOX

Google

YT37A° NAGO
YT37A°
NAGO