Human Generated Data

Title

[Lyonel and Andreas Feininger inspecting a model yacht]

Date

1909?

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.675.30

Machine Generated Data

Tags

Amazon
created on 2022-07-01

Person 98.5
Human 98.5
Person 97.9
Furniture 97.5
Chair 78.2
Clothing 67.3
Apparel 67.3
People 61.9

Imagga
created on 2022-07-01

rain barrel 38.7
vessel 34.7
cistern 31
man 24.2
home 23.9
pool 23.5
reservoir 23.2
room 22
ashcan 21.8
tub 21.4
person 20.2
people 20.1
bin 19.3
container 18.9
male 18.5
tank 18.2
happy 15
old 13.9
family 13.3
indoors 13.2
adult 13.1
couple 13
sitting 12.9
happiness 12.5
hospital 12.3
lifestyle 12.3
men 12
child 11.9
smiling 11.6
bathtub 11.5
interior 10.6
bathroom 10
vintage 9.9
holding 9.9
human 9.7
portrait 9.7
love 9.5
clean 9.2
leisure 9.1
chair 9.1
care 9
health 9
outdoors 8.9
cheerful 8.9
women 8.7
antique 8.6
smile 8.5
face 8.5
two 8.5
senior 8.4
house 8.3
mother 8.1
domestic 8.1
washing 7.8
casual 7.6
hand 7.6
toilet 7.6
fun 7.5
clothing 7.4
indoor 7.3
metal 7.2
dress 7.2
kid 7.1
furniture 7.1

Google
created on 2022-07-01

White 92.2
Black 89.9
Black-and-white 86.6
Style 84.1
Window 82.7
Table 77.8
Monochrome photography 75.5
Monochrome 73.6
Vintage clothing 69
Room 67.1
Water 66.9
Sitting 62.5
Glass 57.9
Chair 52.2

Microsoft
created on 2022-07-01

person 97.8
black and white 84.8
human face 71.1
clothing 69.8
meal 17.1

Color Analysis

Feature analysis

Amazon

Person 98.5%

Categories