Human Generated Data

Title

Untitled (three women and one man at Grange meeting, displaying produce and preserves)

Date

1956

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18272

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.4
Human 99.4
Person 98.7
Person 98.5
Person 97.9
Clothing 86.3
Apparel 86.3
Building 80.6
Face 77.5
Housing 77.5
Table 63.7
Furniture 63.7
Outdoors 61.2
Female 60.5
Shorts 59.6
Statue 58.8
Art 58.8
Sculpture 58.8
Mammal 58.8
Animal 58.8
Sitting 56.9
Countryside 55.1
Nature 55.1

Clarifai
created on 2023-10-22

people 100
group together 99.7
group 99.1
adult 99.1
man 97.9
several 96.1
two 95.8
woman 95.4
furniture 94.7
three 93.8
many 92
four 91.9
recreation 91.1
instrument 91.1
music 90.6
monochrome 90.4
child 90.3
leader 90.2
five 88.4
wear 87.3

Imagga
created on 2022-03-04

marimba 100
percussion instrument 100
musical instrument 94.3
people 26.2
man 25.5
sitting 23.2
male 21.3
person 20.2
lifestyle 17.3
women 15.8
happy 15.6
outdoors 14.9
adult 14.6
together 13.1
couple 13.1
cheerful 13
portrait 12.9
computer 12.8
business 12.7
day 12.5
bench 12.1
smiling 11.6
businessman 11.5
table 11.3
education 11.2
attractive 11.2
men 11.2
happiness 11
group 10.5
love 10.2
two 10.2
laptop 10
smile 10
students 9.7
vibraphone 9.7
office 9.6
chair 9.6
black 9.6
work 9.4
friends 9.4
back 9.2
holding 9.1
park 9
student 9
room 9
lady 8.9
working 8.8
boy 8.7
school 8.6
youth 8.5
sit 8.5
communication 8.4
relaxation 8.4
mature 8.4
old 8.4
silhouette 8.3
teenager 8.2
girls 8.2
technology 8.2
suit 8.1
home 8
indoors 7.9
casual 7.6
talking 7.6
relax 7.6
friendship 7.5
fun 7.5
leisure 7.5
one 7.5
study 7.4
teen 7.3
water 7.3
copy space 7.2

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 95.2
outdoor 94.1
table 93.3
person 91.8
clothing 91.4
furniture 90
black and white 84.4
black 80
white 71.2
old 68.7
chair 56.4
man 55.8

Color Analysis

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 70.9%
Calm 99.9%
Sad 0.1%
Confused 0%
Happy 0%
Surprised 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 48-54
Gender Male, 99.7%
Calm 99.7%
Surprised 0.1%
Sad 0%
Happy 0%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.4%
Person 98.7%
Person 98.5%
Person 97.9%

Text analysis

Amazon

37
NO 37
NO
USA
NUTFIELD
OWN
LD
OWN IN
NUTFIELD GRANGE
IN
GRANGE
USA IN 1719
1719
FIRST
FIRST POTAT
POTAT

Google

NUTFIELD GRANGE NO 37 FIRST POTAT OWN IN USA IN 1719 LD
NUTFIELD
GRANGE
NO
37
FIRST
POTAT
OWN
IN
USA
1719
LD