Human Generated Data

Title

Untitled (young man and woman standing by jukebox)

Date

c. 1950

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2394

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Person 99.6
Human 99.6
Person 99.5
Clothing 93.2
Apparel 93.2
Shorts 82.5
Female 82.4
Room 81.9
Indoors 81.9
Furniture 80.5
Dress 70.6
Girl 69.9
Shoe 65.9
Footwear 65.9
Woman 64.9
Hair 58.6
Shoe 50

Clarifai
created on 2023-10-28

people 99.9
monochrome 99.3
adult 99.1
wear 98.6
woman 98.6
one 97.6
two 97.4
laundry 93.8
portrait 92
newspaper 88.2
facial expression 88.2
commerce 86.7
girl 85.9
man 85.4
music 85.3
retro 84.9
dress 84.6
street 83.9
group 81.9
family 81.1

Imagga
created on 2022-01-30

shopping cart 100
handcart 100
wheeled vehicle 89.3
container 72.8
shopping 50.5
cart 45.8
basket 36.5
buy 35.7
shop 33.7
sale 32.4
market 30.2
conveyance 29.8
supermarket 29.6
trolley 27.6
store 26.5
retail 22.8
metal 22.5
purchase 20.2
empty 18.9
buying 18.3
business 17.6
push 17.1
shopping basket 17
trade 15.3
object 14.7
commerce 14
customer 13.3
checkout 12.8
people 12.8
money 12.8
grocery 12.6
wheel 12.3
commercial 12.2
man 12.1
metallic 12
pushcart 11.9
mall 11.7
3d 11.6
steel 10.6
equipment 10.2
person 10.2
symbol 10.1
consumer 9.7
chrome 9.4
finance 9.3
male 9.2
adult 9.1
consumption 8.9
dishwasher 8.8
silver 8.8
conceptual 8.8
trading 8.8
pay 8.6
holding 8.3
transport 8.2
technology 8.2
home 8
purchasing 7.9
holiday 7.9
collect 7.9
e commerce 7.8
carry 7.8
carrying 7.8
gift 7.7
sell 7.7
men 7.7
sitting 7.7
plastic 7.3
cash 7.3
full 7.3
lady 7.3
smiling 7.2
lifestyle 7.2
white goods 7.2
building 7.1

Microsoft
created on 2022-01-30

person 94.5
black and white 87.8
text 87.3
outdoor 87.2
clothing 79.3

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 98.9%
Happy 87.7%
Calm 8.5%
Sad 1.3%
Confused 0.7%
Disgusted 0.5%
Angry 0.5%
Fear 0.4%
Surprised 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.6%
Person 99.5%
Shoe 65.9%
Shoe 50%

Categories

Imagga

paintings art 99.3%

Text analysis

Amazon

an

Google

YT3RA2-XAGO Sectur
YT3RA2-XAGO
Sectur