Human Generated Data

Title

Untitled (musicians in ballroom)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19241

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Chair 99.6
Furniture 99.6
Person 98.4
Human 98.4
Person 97.4
Musician 97
Musical Instrument 97
Chair 93.8
Chair 89.3
Leisure Activities 88.9
Music Band 81.9
Guitar 79.1
Chair 70.4
Guitarist 70.4
Performer 70.4
Person 65.8
Chair 61.6
Chair 61.3
Sitting 60
Cello 59.3
Photography 55.9
Photo 55.9
Chair 50.9
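
The numbers above are machine confidence scores for each label. As a minimal sketch only, assuming these tags were produced by Amazon Rekognition's DetectLabels API (the image filename below is hypothetical), comparable output could be generated like this:

    # Minimal sketch: print label/confidence pairs like those listed above.
    # Assumption: the tags come from Amazon Rekognition DetectLabels;
    # the local filename is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_musicians_in_ballroom.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # the lowest score shown above is Chair 50.9
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')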

Clarifai
created on 2023-10-22

people 98.4
chair 98
man 96.3
adult 94.5
monochrome 92.8
sit 92
furniture 91.8
woman 89.2
seat 86.9
wear 86.3
group 85.1
sitting 82.4
table 82
group together 80.9
young 79.2
indoors 78.7
recreation 78.2
business 76.7
family 76.6
wait 75

Imagga
created on 2022-03-05

crutch 55
staff 43.5
stick 32.4
people 22.3
person 20
man 16.1
adult 16.1
beach 15.3
outdoors 14.9
silhouette 14.1
chair 13.3
male 12.8
water 12.7
women 12.6
sketch 12.4
summer 12.2
sky 12.1
ocean 11.6
brass 11.4
sand 11.3
attractive 11.2
body 11.2
men 11.2
drawing 11
sea 10.9
leisure 10.8
human 10.5
day 10.2
lifestyle 10.1
clothing 10
portrait 9.7
black 9.6
couple 9.6
wind instrument 9.4
travel 9.1
life 9.1
vacation 9
group 8.9
light 8.7
outside 8.6
business 8.5
sport 8.4
building 8.4
fashion 8.3
landscape 8.2
sunset 8.1
activity 8.1
sun 8
art 7.9
device 7.9
model 7.8
sitting 7.7
outdoor 7.6
happy 7.5
city 7.5
lady 7.3
smiling 7.2
sexy 7.2
hair 7.1
musical instrument 7
modern 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.2
outdoor 90.2
black and white 80.9
posing 77.8
clothing 70.5
person 69
old 65.7

Color Analysis

Face analysis

Google

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
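
The two blocks above give a likelihood rating per detected face. A minimal sketch, assuming the ratings come from Google Cloud Vision face detection (the filename is hypothetical):

    # Minimal sketch: print likelihood ratings (VERY_UNLIKELY ... VERY_LIKELY)
    # for each detected face, mirroring the two blocks above.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_musicians_in_ballroom.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    for face in client.face_detection(image=image).face_annotations:
        for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
            rating = getattr(face, f"{attr}_likelihood")
            print(attr.capitalize(), vision.Likelihood(rating).name)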

Feature analysis

Amazon

Chair
Person
Chair 99.6%
Chair 93.8%
Chair 89.3%
Chair 70.4%
Chair 61.6%
Chair 61.3%
Chair 50.9%
Person 98.4%
Person 97.4%
Person 65.8%

Categories

Imagga

paintings art 96.4%
text visuals 1.8%

Text analysis

Amazon

6
tirm
KAOO
SWEETA tirm
3
LAGOX
SWEETA
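
A minimal sketch, assuming the strings above are line-level OCR results from Amazon Rekognition's DetectText API (the filename is hypothetical):

    # Minimal sketch: print detected text lines, as listed above.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_musicians_in_ballroom.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])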

Google

MJI VT O0D
MJI
VT
O0D