Human Generated Data

Title

Untitled (small boy wearing plaid pants, seated in wooden chair next to a radio)

Date

1951

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18254

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.3
Human 99.3
Shoe 90.5
Clothing 90.5
Footwear 90.5
Apparel 90.5
Leisure Activities 77.4
Guitar 75.6
Musical Instrument 75.6
Sewing 75.4
Appliance 67
Shoe 62.5
Machine 60.9

Clarifai
created on 2023-10-22

people 99.9
child 99.3
one 98.3
furniture 96.9
room 96.4
music 95.2
chair 94
recreation 92.7
two 90.4
sit 89.7
adult 89.4
wear 89.2
portrait 89.2
musician 88.4
nostalgia 87.9
boy 86.7
family 86
seat 85.9
art 85.5
son 84.8

Imagga
created on 2022-03-04

shopping cart 50.5
chair 41.7
handcart 36.7
wheeled vehicle 28.3
seat 26.6
container 19.7
furniture 18.8
barber chair 18.8
man 17.5
shopping 17.4
cart 16.6
person 14.6
sitting 13.7
metal 13.7
business 13.4
negative 12.9
people 12.8
male 12.8
shop 12
outdoors 11.9
buy 11.3
empty 11.2
old 11.1
day 11
park 10.7
adult 10.5
film 10.2
lifestyle 10.1
relaxation 10
rocking chair 9.9
market 9.8
human 9.7
conveyance 9.7
urban 9.6
work 9.5
structure 9.5
retail 9.5
store 9.4
wall 9.4
summer 9
men 8.6
building 8.5
sale 8.3
leisure 8.3
holding 8.2
relaxing 8.2
businessman 7.9
basket 7.9
trolley 7.9
holiday 7.9
photographic paper 7.9
supermarket 7.8
black 7.8
money 7.7
furnishing 7.5
one 7.5
bench 7.5
wicker 7.2
suit 7.2
home 7.2
bank 7.2
architecture 7
modern 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 95.9
black and white 91.9
furniture 85.7
clothing 81
person 76.2
chair 68.7
table 60.3
monochrome 52.5

Face analysis

AWS Rekognition

Age 28–38
Gender Male, 93.7%
Happy 93.9%
Fear 3%
Surprised 1.6%
Calm 0.8%
Angry 0.2%
Sad 0.2%
Disgusted 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Shoe 90.5%
Shoe 62.5%

Text analysis

Amazon

VAGOY
VI37A2 VAGOY
VI37A2

Google

EXTEEFDTE